<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Mandeep Singh Gulati</title>
    <description>The latest articles on DEV Community by Mandeep Singh Gulati (@mandeepm91).</description>
    <link>https://dev.to/mandeepm91</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F175919%2F4a0249d7-c7e9-499d-a615-560a2a6b322e.jpeg</url>
      <title>DEV Community: Mandeep Singh Gulati</title>
      <link>https://dev.to/mandeepm91</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/mandeepm91"/>
    <language>en</language>
    <item>
      <title>How to navigate this seemingly complex world of tech as an absolute beginner?</title>
      <dc:creator>Mandeep Singh Gulati</dc:creator>
      <pubDate>Sun, 25 Jul 2021 15:16:52 +0000</pubDate>
      <link>https://dev.to/mandeepm91/how-to-navigate-this-seemingly-complex-world-of-tech-as-an-absolute-beginner-43hh</link>
      <guid>https://dev.to/mandeepm91/how-to-navigate-this-seemingly-complex-world-of-tech-as-an-absolute-beginner-43hh</guid>
      <description>&lt;p&gt;&lt;em&gt;This post is aimed at helping those people who are thinking of entering the world of tech and start a career as a software developer, but are too overwhelmed by the information available to them and finding it hard where to start.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;We will be taking a 10,000-foot view of the world of tech and the various career paths you can explore as a beginner. Along the way, we will answer questions like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What are programming languages and which one to learn?&lt;/li&gt;
&lt;li&gt;What's web development?&lt;/li&gt;
&lt;li&gt;What's a web server?&lt;/li&gt;
&lt;li&gt;What's cloud?&lt;/li&gt;
&lt;li&gt;What's HTML, CSS and JavaScript?&lt;/li&gt;
&lt;li&gt;What's Python?&lt;/li&gt;
&lt;li&gt;What's front-end, back-end and full-stack?&lt;/li&gt;
&lt;li&gt;What are frameworks?&lt;/li&gt;
&lt;li&gt;What's Artificial Intelligence and Machine Learning?&lt;/li&gt;
&lt;li&gt;What are databases?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you feel overwhelmed already, don't worry. I have over-simplified things to help you get started. Once you dive deeper into any of these questions, there's a lot to learn, but think of this article as a starting point. The intent of this post is to provide enough information to help you get started without overwhelming you as a beginner. So, if you feel like something in this post is too complex and difficult to grasp, feel free to let me know. I would like to update this post based on feedback to make it as beginner-friendly as possible.&lt;/p&gt;

&lt;p&gt;So, first things first:&lt;/p&gt;

&lt;h3&gt;What are programming languages and which one to learn?&lt;/h3&gt;

&lt;p&gt;Being a software developer is all about talking to machines in a language they understand. Different types of languages serve different purposes. For example:&lt;/p&gt;

&lt;h3&gt;Markup language like HTML&lt;/h3&gt;

&lt;p&gt;HTML allows you to define the structure of a web page. What does it mean? Let's take an example of Wikipedia. When you visit &lt;a href="http://www.wikipedia.org"&gt;www.wikipedia.org&lt;/a&gt;, you can see a page like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--W_6qED_J--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1626577039396/LJWV9-7jA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--W_6qED_J--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1626577039396/LJWV9-7jA.png" alt="image.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you see the structure of the page, there is a sidebar on the left which includes links like "Main page", "Contents", "Current events", etc. There is a top navigation bar as well with tabs like "Main page", "Talk", "Read", "View source", "View history" and a search bar.&lt;/p&gt;

&lt;p&gt;Below that, there is the main content section, which is further divided into blocks like "Welcome to Wikipedia", "From today's featured article" and "In the news".&lt;br&gt;
All of this information about the structure of the web page is defined using a markup language called HTML, which stands for HyperText Markup Language. To get a feel for it and play around, you can visit &lt;a href="https://www.w3schools.com/html/html_intro.asp"&gt;w3schools&lt;/a&gt;.&lt;/p&gt;
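
&lt;p&gt;To give you a small taste (this is a made-up snippet, not Wikipedia's actual markup), a page with a navigation bar and a main content section could be structured like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;!DOCTYPE html&amp;gt;
&amp;lt;html&amp;gt;
  &amp;lt;body&amp;gt;
    &amp;lt;nav&amp;gt;
      &amp;lt;a href="/main-page"&amp;gt;Main page&amp;lt;/a&amp;gt;
      &amp;lt;a href="/contents"&amp;gt;Contents&amp;lt;/a&amp;gt;
    &amp;lt;/nav&amp;gt;
    &amp;lt;main&amp;gt;
      &amp;lt;h1&amp;gt;Welcome to Wikipedia&amp;lt;/h1&amp;gt;
      &amp;lt;p&amp;gt;From today's featured article...&amp;lt;/p&amp;gt;
    &amp;lt;/main&amp;gt;
  &amp;lt;/body&amp;gt;
&amp;lt;/html&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Each pair of tags (like &lt;code&gt;nav&lt;/code&gt; or &lt;code&gt;main&lt;/code&gt;) marks out one block of the page's structure.&lt;/p&gt;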

&lt;p&gt;That covers the introduction to markup languages. Another type of language is:&lt;/p&gt;
&lt;h3&gt;Style sheet language like CSS (Cascading Style Sheets)&lt;/h3&gt;

&lt;p&gt;Now we know that using HTML, we can structure a web page. But HTML alone doesn't control the presentation of a web page, for example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What should the width of our sidebar element be?&lt;/li&gt;
&lt;li&gt;What should the background color of our "In the news" and "From today's featured article" blocks be?&lt;/li&gt;
&lt;li&gt;What should the color and font size of headings, paragraphs, links, etc. be?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All of this, and a lot of other presentation information, is controlled via CSS. You can also create complex graphics and animations using CSS, but that's very advanced-level stuff. As beginners, we'll only need to use it for controlling the basics. To get a feel for it and play around, you can visit &lt;a href="https://www.w3schools.com/css/default.asp"&gt;w3schools&lt;/a&gt;.&lt;/p&gt;
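
&lt;p&gt;To make this concrete, here is a small, made-up CSS snippet (the class name and values are hypothetical, not Wikipedia's actual styles):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/* Give the sidebar a fixed width and a light background */
.sidebar {
  width: 176px;
  background-color: #f8f9fa;
}

/* Make headings dark gray and slightly larger */
h1 {
  color: #202122;
  font-size: 28px;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;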

&lt;p&gt;So far, we've covered one markup language and one style sheet language. There are markup languages other than HTML and style sheet languages other than CSS, but for web development, HTML and CSS are the only ones you'll ever need to learn.&lt;/p&gt;

&lt;p&gt;Next, we will cover programming languages&lt;/p&gt;
&lt;h3&gt;Programming languages (like JavaScript, Python, C#, Go, Java, PHP, etc.)&lt;/h3&gt;

&lt;p&gt;These languages allow you to run logical computations. For example, when you log in to a site like Twitter, you fill in your email address and password on a web page and click the "Log in" button, and your browser sends a request to Twitter's server. On the server, there is a computer program which looks up your credentials in Twitter's database, checks whether the password you provided is correct, and then returns a success/failure response back to the browser.&lt;br&gt;
Once you are logged in, your browser sends another request to fetch your tweets, notifications, etc., and this calls another computer program on Twitter's servers which fetches this information from their database based on your user id and returns it to the browser.&lt;br&gt;
All of this interaction between your browser and the server is written using programming languages. There is code on the browser side as well as the server side. Some of the code that executes in the browser:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Sending the request to the server with user credentials when the login button is clicked&lt;/li&gt;
&lt;li&gt;Sending the request to the server to fetch the user's tweets, notifications, etc.&lt;/li&gt;
&lt;li&gt;Sending a request to the server to like a tweet or post a comment&lt;/li&gt;
&lt;li&gt;Tracking user behavior like scrolling, clicking, etc.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All of this code that executes on the browser is written in a programming language called JavaScript (Don't confuse it with another, completely different programming language called Java. There is no similarity between Java and JavaScript except their names). Browsers can only execute JavaScript code and not any other programming language code. &lt;br&gt;
So, if you want to become a front-end developer, which means a software developer responsible for &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Structuring the layout of a web-page (using HTML)&lt;/li&gt;
&lt;li&gt;Defining the presentation of the web-page (using CSS)&lt;/li&gt;
&lt;li&gt;And writing logic for interactions on a web page (using JavaScript)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Then the above three languages are what you'll need to learn. The sequence in which you learn them is up to you. You may start with HTML, then learn CSS, and then dive into JavaScript. Or you can start directly with the core JavaScript language, then learn HTML and CSS, and then learn how to use JavaScript for browser-based interactions. There is no right or wrong approach to learning these 3 languages, but know that you will need all 3 in order to become a front-end developer. To be employable as a front-end developer, you will need to expand your knowledge further by learning a framework like Vue, React, Angular, Svelte, etc. We will cover these frameworks later in this article. But for now, you need to know that HTML, CSS and JavaScript are necessary skills for a front-end developer.&lt;/p&gt;
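
&lt;p&gt;To give you a feel for what browser-side JavaScript looks like, here is a hedged sketch of the login interaction we discussed earlier. The endpoint, element ids and field names are all hypothetical, not any real site's API:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Package the user's credentials as a JSON request for the server
function buildLoginRequest(email, password) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ email: email, password: password })
  };
}

// In a real page, this wires the function to a login button
if (typeof document !== "undefined") {
  document.querySelector("#login-button").addEventListener("click", function () {
    const request = buildLoginRequest(
      document.querySelector("#email").value,
      document.querySelector("#password").value
    );
    // Send the credentials to the (hypothetical) login endpoint
    fetch("/login", request).then(function (response) {
      console.log(response.ok ? "Logged in!" : "Login failed");
    });
  });
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;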

&lt;p&gt;So, we know that browsers can only execute JavaScript language. However, there is no such restriction when our code does not have to execute on a browser.&lt;/p&gt;

&lt;p&gt;When we have to run some piece of code that does not need to run on a web browser, we have a lot of freedom to choose from the various programming languages available. For example, the operating system you are using like Windows, Linux or Mac is a collection of computer programs written in some programming language. The web browser itself is also a collection of various computer programs written in some programming languages. Every software you use is a collection of computer programs written in a programming language. Theoretically, anything you can do in one programming language can also be done in any other programming language. However, practically, some languages are better suited for certain tasks than others, due to various factors like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;ease of use&lt;/li&gt;
&lt;li&gt;performance&lt;/li&gt;
&lt;li&gt;tools and community built around that language&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You might have already heard this or read this before but let me re-iterate that:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Learning one programming language makes it relatively easier to learn another programming language&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;So, if you know C#, learning other languages like Python, JavaScript, Java, Go, etc. becomes relatively easier. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;But which one to choose first?&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;It depends on which career path you want to pursue. Of course, if you are an absolute beginner, you most probably don't know yet which career path you want to choose. So, I'll share some of the popular career paths and how they influence which languages to learn. Before I do that, however, I would like to make a suggestion if you don't want to decide on a career path at this point in your journey. If you're an absolute beginner who wants to learn programming, I would recommend starting with Python. In my opinion, it is the most beginner-friendly language, has a huge community and can be used in various domains, especially in the field of data science. Having said that, if you want to choose a programming language based on a career path, here are some of the popular paths you can explore in tech.&lt;/p&gt;
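
&lt;p&gt;Just to show how readable Python can be for a beginner, here is a tiny made-up example:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# A small function that greets a new programmer
def greet(name):
    return "Hello, " + name + "! Welcome to the world of tech."

print(greet("Mandeep"))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;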
&lt;h3&gt;Front-end engineer&lt;/h3&gt;

&lt;p&gt;We've already talked about this path earlier and we know that in order to become a front-end engineer one must learn HTML, CSS and JavaScript and then learn some of the popular frameworks like Vue, React, Angular, Svelte, etc. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;But, what is a framework?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Think of it as a collection of programs written by someone else that you can include in your project to simplify the process of building your web application (the term "web application" is used interchangeably with "website"). A framework provides you with abstractions on top of the programming language. As an over-simplification, you may think of a framework as a new language in itself, though technically it is not. Usually a framework helps by letting you write less code to accomplish something than you would without it. It also takes care of a lot of things out of the box, so you have less to worry about. You don't have to learn every framework mentioned above; any one of them will do. I would advise you to do some research regarding the job market in your location to see which of these frameworks is in demand. Personally, I liked both React and Vue, with more inclination towards Vue because I found it relatively easier to learn. I haven't tried Svelte yet but I've heard good things about it. Whichever framework you pick, I would suggest you stick with it and build something using it before jumping on to learn another framework.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;In order to be a good software developer, you need to build things. This is the fastest way to learn. Watch a tutorial, then build something using what you've learned. If you don't apply your knowledge, it will diminish within days or weeks. By building something, whatever you learn will stick for a very long time.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The frameworks we talked about earlier are all JavaScript frameworks. There are frameworks for CSS as well, Bootstrap being probably the most popular one. There are others like Tailwind, Foundation, etc., but I don't know much about them. Once you gain some basic CSS skills, and if becoming a CSS expert is what you want to do, you may explore those frameworks and choose whichever you like.&lt;/p&gt;
&lt;h3&gt;Mobile app developers&lt;/h3&gt;

&lt;p&gt;Mobile app development is another very popular career path that a lot of engineers pursue. The world of mobile app development is very vast but mobile apps can be broadly classified into two categories:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Native apps&lt;/li&gt;
&lt;li&gt;Hybrid apps&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Native apps are written in languages specific to the platform they run on. For example, on iOS, native apps are written in a programming language called Swift. So, if you wish to become an iOS app developer, Swift is the language you'll need to learn. For Android, native mobile apps are written in Java (not JavaScript) or Kotlin. So, if you want to become an Android developer, you'll need to learn one of those. Do note, however, that for an absolute beginner, learning Java or Swift can be quite challenging, so don't be discouraged if you find it very difficult in the beginning.&lt;/p&gt;

&lt;p&gt;Hybrid apps, on the other hand, are web apps. They're created using frameworks like React Native, Vue Native, NativeScript, Ionic, etc. These are like normal web applications that you create using HTML, CSS and JavaScript, but they can be packaged into mobile applications that can be installed on both iOS and Android. So, this career path has a lot of overlap with that of a front-end web developer.&lt;/p&gt;
&lt;h3&gt;Backend engineer&lt;/h3&gt;

&lt;p&gt;When we talk about a backend engineer, we usually refer to the people who write the server-side code that a website interacts with. Taking our earlier example of Twitter, the code that validates the user's credentials on the server side and the code that fetches the user's information like their tweets, notifications, likes, etc. from the database is written by a backend engineer. This code executes on the server side. But what is a server?&lt;/p&gt;

&lt;p&gt;As an over-simplification, think of it as a computer program running on a BIG computer located in a secure location. &lt;/p&gt;

&lt;p&gt;The code written on the client side (running in a browser) interacts with this computer program on the server by making network requests. So, if your internet connection is down, the browser code cannot interact with the server and your interaction with the website breaks. These network requests between the client and server follow a specific protocol for communication called HTTP, which stands for Hyper Text Transfer Protocol. The HTTP protocol dictates a set of rules that must be followed by the client and server to communicate with each other.&lt;br&gt;
The code running in the browser sends requests to the server and the server returns responses. These requests and responses are messages that must be in a specific format, and this format is dictated by the HTTP protocol. For example, in order to fetch the user's notifications, the browser code will make an HTTP request like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;GET /user/notifications
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is just an oversimplified example of what a request could be and not the actual request being sent in the Twitter web application. Also, this is just one small portion of a request, to give you an idea of how HTTP requests are identified by their verbs like "GET", "POST", "PUT", etc. and the URL, which is &lt;code&gt;/user/notifications&lt;/code&gt; in this case.&lt;/p&gt;

&lt;p&gt;For posting a tweet, the browser code would send some request like (again, this is just a hypothetical example and not at all the actual representation of the Twitter web app)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;POST /tweets
{
   "content": "I just wrote my first blog on HashNode"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Usually, when we have to perform a write operation (anything that modifies some data in the database), for example creating a tweet or liking a tweet, we use a POST request, and when we have to perform a read operation, like fetching notifications, followers, etc., we use a GET request. This is not enforced and you could use GET or POST any way you like, but in general, we do it this way. There are other verbs like PUT, PATCH and DELETE as well, but we won't go into the details of those in this article.&lt;br&gt;
POST requests usually have a request body associated with them. In the above example, the request body is enclosed in curly braces &lt;code&gt;{}&lt;/code&gt;. This kind of representation of data is called JSON (JavaScript Object Notation) and it is a very popular way of sending data between client and server.&lt;/p&gt;

&lt;p&gt;So, now that we know that the client code sends HTTP requests to communicate with the server, let's talk about how the server code handles these requests.&lt;/p&gt;

&lt;p&gt;On the server side, we need to write handlers for these HTTP requests. So, it's like:&lt;/p&gt;

&lt;p&gt;Perform a certain action when we receive a &lt;code&gt;GET&lt;/code&gt; request on the URL &lt;code&gt;/user/notifications&lt;/code&gt;. HTTP requests also contain a lot of other information about the user session, browser, etc., which helps the server code identify where the request is coming from and which user is requesting this information.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;These HTTP request handlers in the server code, which provide a sort of interface between the client and the server, are called &lt;strong&gt;APIs (Application Programming Interfaces)&lt;/strong&gt;.&lt;/em&gt;&lt;br&gt;
In these request handlers, a lot of stuff can happen, like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;authorization logic to check if the user is authorized to make this request&lt;/li&gt;
&lt;li&gt;reading from/writing to a database&lt;/li&gt;
&lt;li&gt;running some sort of a computation&lt;/li&gt;
&lt;li&gt;calling another external service&lt;/li&gt;
&lt;li&gt;and there are many other possibilities&lt;/li&gt;
&lt;/ul&gt;
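
&lt;p&gt;As a hedged sketch (using JavaScript on Node.js, with made-up URLs and data), a request handler for &lt;code&gt;GET /user/notifications&lt;/code&gt; could look something like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const http = require("http");

// Decide what to send back for a given method and URL
function handleRequest(method, url) {
  if (method + " " + url === "GET /user/notifications") {
    // In a real app, this data would come from a database
    return { status: 200, body: { notifications: ["Alice liked your tweet"] } };
  }
  return { status: 404, body: { error: "Not found" } };
}

// Wire the handler into an HTTP server
const server = http.createServer(function (req, res) {
  const result = handleRequest(req.method, req.url);
  res.writeHead(result.status, { "Content-Type": "application/json" });
  res.end(JSON.stringify(result.body));
});

// server.listen(3000); // uncomment to start listening on port 3000
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;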

&lt;p&gt;This code can be written in any programming language. Some of the popular choices are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;JavaScript&lt;/li&gt;
&lt;li&gt;TypeScript&lt;/li&gt;
&lt;li&gt;Python&lt;/li&gt;
&lt;li&gt;C#&lt;/li&gt;
&lt;li&gt;Go&lt;/li&gt;
&lt;li&gt;Java&lt;/li&gt;
&lt;li&gt;PHP&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;There are other languages as well which can be used and I've only listed the popular ones that I could recall. But learning a language is often not sufficient for becoming a backend engineer. One needs to have a good understanding of databases as well. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;What is a database, you may ask?&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In very simple terms, it is software that allows you to store and retrieve data in an efficient manner. Take the example of your cellphone's phonebook app. The phonebook app maintains its own internal database in which it stores all the contacts with phone numbers, first name, last name, social media profile information, etc. and allows you to search through the contacts efficiently. Real-world databases can be very complex, containing information about various entities. For example, if you were to build a Twitter clone, you would need to store information about users, their tweets, likes, followers, people they follow, bookmarks, lists, etc. A database management system (DBMS) allows us to store and retrieve such inter-connected information in an efficient manner. DBMSs can be broadly classified into two categories:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Relational database management systems (RDBMS). These are databases where information about different entities is stored in different tables. For example, say we use a users table to store all the users, a tweets table to store all the tweets, and a notifications table to store all the notifications. These tables have references between each other. For example, the tweets table can contain a user id column which is a reference to the users table. Interaction with an RDBMS (storing and retrieving data, creating and modifying tables, etc.) happens through another programming language called Structured Query Language (SQL, often pronounced "sequel"). SQL is very different from the other programming languages we've discussed, for example Python, Java, C# and JavaScript. Some find it easy, some find it hard, but as a backend developer it is an essential skill to have, and good SQL knowledge helps you write efficient code.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Non-relational databases (also called NoSQL databases). The challenge with many relational databases is that it becomes expensive to scale them up as data grows (think terabytes of data). In order to solve this problem, many NoSQL databases came into existence. Some examples are MongoDB, Redis, Aerospike, Cassandra, CouchDB, Elasticsearch, DynamoDB and Neo4j. These databases don't support SQL and either have their own domain-specific language or provide drivers in various programming languages to interact with them. Which one to use depends purely on the type of problem you're trying to solve, and we won't be going into the specifics of each. Likewise, which one to learn depends on the type of jobs you're applying for. But if you want to start on the career path of a backend developer, I would recommend starting with a programming language for writing backend code along with SQL for interacting with a relational DBMS.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
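
&lt;p&gt;To give you a flavor of SQL (the table and column names here are hypothetical), fetching a user's most recent tweets from an RDBMS could look like:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;-- Find the 10 most recent tweets by the user with id 42
SELECT content, created_at
FROM tweets
WHERE user_id = 42
ORDER BY created_at DESC
LIMIT 10;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;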

&lt;p&gt;We've talked about the front-end developer, responsible for building the browser side of a web application, and the back-end developer, responsible for building the server side. For small-scale startups, it's often economical to hire engineers who know both of these worlds and can wear many hats depending on the need. There is a term for such web developers: "full-stack developer". Most full-stack developers start with one end of the stack (back-end or front-end) and gradually acquire the skills for the other end.&lt;/p&gt;

&lt;p&gt;For example, someone may start by learning HTML, CSS and JavaScript to become a front-end engineer, and then, since JavaScript can also be used on the backend (using Node.js), they may explore that territory, learn about databases, caching and other programming concepts to become a full-stack developer.&lt;/p&gt;

&lt;p&gt;In my case, it was the opposite. I spent the first 4 years of my career interacting with databases, writing complex SQL code. Then I ventured into the world of backend API development using JavaScript (Node.js) and gradually learned Vue.js (since I already knew JavaScript) and then HTML and CSS. Over the years, I also learned other languages like Python, Go and C# and front-end frameworks like React, which helped me become more resourceful as a full-stack developer.&lt;/p&gt;

&lt;p&gt;So, there is no right or wrong path. Pick whatever interests you and gradually expand further over time. &lt;/p&gt;

&lt;p&gt;So far, we've discussed the career paths primarily related to web development and mobile app development. The world of tech is a lot more vast than that. Let's talk about some other popular career paths&lt;/p&gt;

&lt;h3&gt;Data Scientist&lt;/h3&gt;

&lt;p&gt;This is a career path which has grown significantly in popularity over the last decade (2010 - 2020) and most probably will continue to grow for at least another decade. But what do these people do?&lt;/p&gt;

&lt;p&gt;In the technology world, data is everything because it gives companies an insight into the minds of their customers. Companies collect massive amounts of data about their customers. Taking the example of Twitter, all your interactions are captured: what kind of tweets you like or engage with more often, what kind of tweets you retweet, your scrolling behavior, your browsing behavior, etc. All of this data is then used by companies to customize their platform to your profile. For example, Twitter shows you more of the tweets that you're likely to engage with, YouTube shows you more videos based on your past behavior, Netflix shows you movie recommendations based on your watch history, etc.&lt;/p&gt;

&lt;p&gt;All the analysis that goes into generating such recommendations is the job of a data scientist. They run algorithms on large volumes of data to generate such outcomes. These algorithms involve mathematics and statistics. Domains like "machine learning" and "artificial intelligence" are part of the data science world.&lt;/p&gt;

&lt;p&gt;Customization of a platform based on the user's preferences is just one application of data-driven decision making. There are a ton of other applications. For example, credit card companies analyze their customers' spending patterns to identify customers who are likely to commit fraud or default on their payments. Identifying such customers allows a company to mitigate risk proactively.&lt;/p&gt;

&lt;p&gt;If it sounds exciting, it is! And if you wish to start on this path, learning Python is highly recommended as the starting point, due to the wide range of tools and the huge data science community built around it. Apart from that, learning SQL is also recommended.&lt;/p&gt;

&lt;h3&gt;DevOps Engineers or Cloud Engineers&lt;/h3&gt;

&lt;p&gt;This is another field that has grown in popularity over the last decade. Earlier we talked about various aspects of web development, like creating the front-end application using HTML, CSS and JavaScript and creating the back-end APIs using one of various programming languages. When you are building these applications, you work on your own computer or laptop and test everything there. Once you've tested it and feel confident that it is ready to be shipped and shared with the rest of the world, you will need to deploy it to the web. All the devices on the internet have an address called an IP address. It looks like this: "52.145.234.98" (four numbers separated by dots, where each number can be between 0 and 255). These addresses are called IPv4 addresses. There is another class of addresses called IPv6 addresses, but we don't need to get into the details of that. You just need to know that devices on the internet (like your laptop) have an IP address and other devices can reach you using this address.&lt;/p&gt;

&lt;p&gt;Now technically, you can host your website on your own machine and ask others to type your IP address into their web browsers, and they should be able to see your website. You will need to have a static IP address for this to work. You can read more about static and dynamic IP addresses &lt;a href="https://www.geeksforgeeks.org/difference-between-static-and-dynamic-ip-address/"&gt;here&lt;/a&gt;. You can also purchase a domain from any of the domain registrars like Namecheap, GoDaddy, etc. and map that domain to your IP address. For example, let's say your static IP address is 52.145.234.98 and you purchase a domain name (&lt;a href="http://www.mysupercooldomain.com"&gt;www.mysupercooldomain.com&lt;/a&gt;) and configure it to map to your IP address. Then, if someone types &lt;a href="http://www.mysupercooldomain.com"&gt;http://www.mysupercooldomain.com&lt;/a&gt; into their web browser, they should see your website, provided it is up and running on your laptop. While it is possible to host your website from your own laptop, it is not practical to do so, because once you shut down your laptop, your website will be unavailable. Also, as more and more users visit your website, the load on your laptop will increase and slow it down. In a real-world scenario, we need our application to be resilient to such failures. Companies like Amazon, Google and Microsoft provide the option of deploying our applications on their infrastructure. Amazon offers Amazon Web Services (AWS), Google offers Google Cloud Platform (GCP) and Microsoft offers Microsoft Azure. These companies have massive data centers, bigger than football fields, containing a lot of computers. These data centers are also commonly referred to as "the cloud" (and this has got nothing to do with the actual clouds responsible for rain :D).&lt;/p&gt;

&lt;p&gt;As a funny aside, I would like to share &lt;a href="https://www.youtube.com/watch?v=BgJVo_zndhw"&gt;this video&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Again, as an oversimplification: using these cloud offerings like AWS, GCP or Azure, we can deploy our applications on one of those computers and pay for the size of the computer we use. Big corporations often have their own data centers and don't need to rely on these cloud providers to deploy their applications. However, startups and smaller organizations benefit greatly from cloud providers like AWS, GCP and Azure, since they don't have to be concerned with provisioning the infrastructure for their application and can easily scale up or down based on their requirements and budget.&lt;/p&gt;

&lt;p&gt;The job of a DevOps engineer or a cloud engineer is to take your application and deploy it on one of these cloud providers. They're responsible for setting up the architecture to ensure the application is resilient to failures and can scale up with rising traffic. These engineers often have a solid understanding of engineering principles, strong knowledge of distributed computing and expertise with one or more of the various cloud providers like AWS, GCP, Azure, etc. They're also proficient with the command line (the terminal on Linux and Mac, and PowerShell on Windows).&lt;/p&gt;

&lt;p&gt;As a beginner, starting with this path can be challenging and quite overwhelming. But if you want to pursue it, I would recommend getting comfortable with the command line (preferably the Linux or Mac terminal) and starting with beginner-level YouTube videos or Udemy courses on one of the cloud providers (preferably AWS).&lt;/p&gt;

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;The world of tech is very vast and we've only scratched the surface. But I hope that this post helped you understand what kind of work various software developers do. There are a lot of other roles as well like a system administrator (SysAdmin), database engineer, database administrator, network engineer, etc. and a lot more specialized roles that I haven't mentioned in this post. But I hope that after reading through this post, you feel a little less overwhelmed to explore the world of tech as a beginner. &lt;/p&gt;

&lt;p&gt;If this post helped you, share it so that it can help other beginners as well. And if you feel I should change something to improve it, let me know. Since it is aimed at beginners, my intent is to make it as beginner-friendly as possible, and that can only happen if more and more beginners provide constructive feedback on improving it. &lt;/p&gt;

&lt;p&gt;Having said that, I wish you all the best on your journey ahead :-) &lt;/p&gt;

</description>
      <category>100daysofcode</category>
      <category>codenewbie</category>
      <category>beginners</category>
    </item>
    <item>
      <title>How to connect your ExpressJS app with Postgres using Knex</title>
      <dc:creator>Mandeep Singh Gulati</dc:creator>
      <pubDate>Sun, 17 Jan 2021 00:23:24 +0000</pubDate>
      <link>https://dev.to/mandeepm91/how-to-connect-your-expressjs-app-with-postgres-using-knex-76</link>
      <guid>https://dev.to/mandeepm91/how-to-connect-your-expressjs-app-with-postgres-using-knex-76</guid>
      <description>&lt;p&gt;&lt;em&gt;Note: I've created a video for this tutorial if you'd like to check that out &lt;a href="https://www.youtube.com/watch?v=xED_AMH-eBI&amp;amp;feature=youtu.be"&gt;here&lt;/a&gt;. Also, the supporting code for the same can be found &lt;a href="https://github.com/mandeepm91/express-postgres-knex-app"&gt;here&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Express is one of the most popular JavaScript frameworks for building backend APIs, and Postgres is a really popular relational database. How do we connect the two? &lt;/p&gt;

&lt;p&gt;If you look at the official documentation for Express, you'll see a section like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;var pgp = require('pg-promise')(/* options */)
var db = pgp('postgres://username:password@host:port/database')

db.one('SELECT $1 AS value', 123)
  .then(function (data) {
    console.log('DATA:', data.value)
  })
  .catch(function (error) {
    console.log('ERROR:', error)
  })
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It works for sure, but it's not the way you would write it in a full-fledged production application. Some of the questions that come to mind are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;How do you create the tables in the database?&lt;/li&gt;
&lt;li&gt;How do you track changes to the database? For example, when you alter a table, create a new table, or create/drop an index on a field, how do you keep track of all these changes in your git/cvs/svn repository? &lt;/li&gt;
&lt;li&gt;What if you switch from Postgres to some other database in future, say MariaDB for example? Do all your queries still work?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;There might be a lot more questions, but to me the most important one is keeping track of database changes in your application codebase. If someone clones my repository to their local system, they should have a command to create all the database tables in their local setup. Also, as we make changes to the database, such as adding/dropping tables or indices or altering tables, one should be able to run a single command to sync the structure of their local database with that of the production DB. I am talking about structure, not data: all the tables in the local database should have the same structure as those in production, so that testing the application locally is easy. If you don't have this sync mechanism automated, you're likely to run into a lot of issues that you'll end up troubleshooting in production. &lt;/p&gt;

&lt;p&gt;To solve these problems, we have libraries like &lt;a href="http://knexjs.org/"&gt;Knex&lt;/a&gt; and &lt;a href="https://sequelize.org/"&gt;Sequelize&lt;/a&gt;. These libraries provide a very neat API for writing SQL queries that are database agnostic, and they prevent issues like SQL injection attacks. They also provide transaction support for complex DB operations and a streaming API for handling large volumes of data in a script. To keep track of structural changes to your database in your code repo, these libraries use the concept of migrations. Migrations are files where you describe the structural changes you want to make to your database. For example, let's say you have a users table and want to alter it to add a new column &lt;strong&gt;gender&lt;/strong&gt;. You can write a Knex migration file like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;exports.up = knex =&amp;gt; knex.schema
  .alterTable('users', (table) =&amp;gt; {
    table.string('gender')
  });

exports.down = knex =&amp;gt; knex.schema
  .alterTable('users', (table) =&amp;gt; {
    table.dropColumn('gender');
  });

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;strong&gt;up&lt;/strong&gt; function defines what to do when we run the migration and &lt;strong&gt;down&lt;/strong&gt; function defines what to do when we rollback the migration. You can run the migration like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npx knex migrate:latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And you can roll it back like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npx knex migrate:rollback
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once you commit this file to your code repository, your other team members can pull the changes from the repo and run these commands at their end to sync up the database structure on their machines.&lt;/p&gt;

&lt;p&gt;In order to keep track of the database changes (migrations), Knex creates a few extra tables which record which migrations have been applied. So, for example, if one of your team members hasn't synced their database in a long time and, say, 10 new migration scripts have been added since they last synced, then when they pull the latest changes from the repo and run the migration command, all 10 migrations will be applied in the sequence they were added to the repository. &lt;/p&gt;
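&lt;p&gt;As a simplified illustration of the idea (this is not Knex's actual code), the bookkeeping boils down to comparing the migration files on disk with the names already recorded in the migrations table and applying the missing ones in order:&lt;/p&gt;

```javascript
// Simplified sketch of how a migration runner decides what to apply.
// Illustration of the concept only, not Knex's real implementation.
function pendingMigrations(filesOnDisk, appliedNames) {
  const applied = new Set(appliedNames);
  // Migration files are named with a leading timestamp, so sorting them
  // lexicographically yields the order in which they were created.
  return filesOnDisk
    .filter((file) => !applied.has(file))
    .sort();
}

const files = [
  '20210101120000_initial_setup.js',
  '20210105093000_add_gender_to_users.js',
  '20210110170000_create_orders.js'
];
const alreadyApplied = ['20210101120000_initial_setup.js'];

console.log(pendingMigrations(files, alreadyApplied));
// → [ '20210105093000_add_gender_to_users.js', '20210110170000_create_orders.js' ]
```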

&lt;p&gt;Anyway, coming back to the main topic of this post: how do we add Knex to our ExpressJS app, and how do we use it to connect to our Postgres database? Before we dive in, there are some pre-requisites that should be met:&lt;/p&gt;

&lt;h3&gt;
  
  
  Pre-Requisites
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Node JS version 8 or higher installed&lt;/li&gt;
&lt;li&gt;Postgres installed and running on localhost:5432&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Steps
&lt;/h3&gt;

&lt;p&gt;We will divide this article into following steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Creating the Express app&lt;/li&gt;
&lt;li&gt;Creating the API endpoint with some hard coded data&lt;/li&gt;
&lt;li&gt;Creating a database for our app&lt;/li&gt;
&lt;li&gt;Installing and configuring knex&lt;/li&gt;
&lt;li&gt;Populating seed data with knex&lt;/li&gt;
&lt;li&gt;Updating the API endpoint created in step 2 to fetch the data from the database instead of returning hard coded data&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For this tutorial, we will be using Ubuntu Linux, but these instructions should work fine on other operating systems as well. &lt;/p&gt;

&lt;p&gt;So, without further ado, let's get started with creating our Express app.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Creating the Express app
&lt;/h3&gt;

&lt;p&gt;Open the terminal (Command Prompt or PowerShell on Windows), navigate to the directory where you want to create this project and create the project directory. We will be calling our project &lt;strong&gt;express-postgres-knex-app&lt;/strong&gt; (not very innovative, I know :-))&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir express-postgres-knex-app
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Go to the project directory and run the following command to generate some boilerplate code using &lt;strong&gt;express generator&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npx express-generator
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The output should look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
   create : public/
   create : public/javascripts/
   create : public/images/
   create : public/stylesheets/
   create : public/stylesheets/style.css
   create : routes/
   create : routes/index.js
   create : routes/users.js
   create : views/
   create : views/error.ejs
   create : views/index.ejs
   create : app.js
   create : package.json
   create : bin/
   create : bin/www

   install dependencies:
     $ npm install

   run the app:
     $ DEBUG=express-postgres-knex-app:* npm start

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will create some files and directories needed for a very basic Express application. We can customize it as per our requirements. Among other things, it will create an &lt;code&gt;app.js&lt;/code&gt; file and a &lt;code&gt;routes&lt;/code&gt; directory with &lt;code&gt;index.js&lt;/code&gt; and &lt;code&gt;users.js&lt;/code&gt; files inside. In order to run our application, we need to follow the instructions in the output shown above. First, install the dependencies:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm install
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then run the app using the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;DEBUG=express-postgres-knex-app:* npm start
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This should start our server on port 3000. If you go to your browser, you should be able to see the Express application at &lt;a href="http://localhost:3000"&gt;http://localhost:3000&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="/content/images/2021/01/Screenshot-from-2021-01-17-02-32-23.png" class="article-body-image-wrapper"&gt;&lt;img src="/content/images/2021/01/Screenshot-from-2021-01-17-02-32-23.png" alt="Screenshot-from-2021-01-17-02-32-23"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Creating the API endpoint with some hard coded data
&lt;/h3&gt;

&lt;p&gt;The &lt;strong&gt;express generator&lt;/strong&gt; automatically created a &lt;strong&gt;users&lt;/strong&gt; router for us. If you open the file &lt;code&gt;routes/users.js&lt;/code&gt;, you should see code like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;var express = require('express');
var router = express.Router();

/* GET users listing. */
router.get('/', function (req, res, next) {
  res.send('respond with a resource');
});

module.exports = router;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, we need to return the users array instead of the string &lt;code&gt;respond with a resource&lt;/code&gt;, and we need to fetch those users from our database. So, for step 2, we don't need to do anything, as we already have a route created for us by the &lt;strong&gt;express generator&lt;/strong&gt;. In the later steps, we will modify this code to actually fetch the users from our database.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Creating a database for our app
&lt;/h3&gt;

&lt;p&gt;This tutorial has a pre-requisite that Postgres is installed on your machine. So, connect to the Postgres server (for example, with the &lt;code&gt;psql&lt;/code&gt; client) and run the following command to create the database for our app:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;create database "express-app";
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 4: Installing and configuring knex
&lt;/h3&gt;

&lt;p&gt;Install &lt;code&gt;knex&lt;/code&gt; and &lt;code&gt;pg&lt;/code&gt; modules (since we are using &lt;code&gt;postgres&lt;/code&gt;) by running the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm install knex pg
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once installed, initialize knex with a sample config file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npx knex init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This should create a &lt;code&gt;knexfile.js&lt;/code&gt; file in your project's root directory. This file contains the configuration to connect to the database. By default, the knexfile uses &lt;code&gt;sqlite&lt;/code&gt; for &lt;code&gt;development&lt;/code&gt;. We need to change this since we are using &lt;code&gt;postgres&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Modify your &lt;code&gt;knexfile.js&lt;/code&gt; so it looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Update with your config settings.
const PGDB_PASSWORD = process.env.PGDB_PASSWORD;

module.exports = {
  development: {
    client: 'postgresql',
    connection: {
      host: 'localhost',
      database: 'express-app',
      user: 'postgres',
      password: PGDB_PASSWORD
    },
    pool: {
      min: 2,
      max: 10
    },
    migrations: {
      tableName: 'knex_migrations',
      directory: `${__dirname}/db/migrations`
    },
    seeds: {
      directory: `${__dirname}/db/seeds`
    }
  }
};

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, we need to create a service called &lt;strong&gt;DB&lt;/strong&gt; where we initialize &lt;strong&gt;knex&lt;/strong&gt; in our application with the config from &lt;code&gt;knexfile.js&lt;/code&gt;. In the project's root directory, create a directory &lt;code&gt;services&lt;/code&gt; and inside the &lt;code&gt;services&lt;/code&gt; directory, create a file &lt;code&gt;DB.js&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;In that file, add the following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const config = require('../knexfile');

const knex = require('knex')(config[process.env.NODE_ENV || 'development']);

module.exports = knex;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, we are importing the config from the &lt;code&gt;knexfile&lt;/code&gt; and initializing the &lt;code&gt;knex&lt;/code&gt; object with it. Since we will be running our app in &lt;code&gt;development&lt;/code&gt; mode, the value of &lt;code&gt;NODE_ENV&lt;/code&gt; will be &lt;code&gt;development&lt;/code&gt; and the corresponding config will be picked from &lt;code&gt;knexfile.js&lt;/code&gt;. If you run the app in production, you'll need to add a production config to &lt;code&gt;knexfile.js&lt;/code&gt;.&lt;/p&gt;
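&lt;p&gt;Such a production entry might look like the sketch below. The environment variable names here (&lt;code&gt;PGDB_HOST&lt;/code&gt;, &lt;code&gt;PGDB_NAME&lt;/code&gt;, &lt;code&gt;PGDB_USER&lt;/code&gt;) are placeholders of my own choosing, not part of the tutorial; the point is that production credentials come from the environment, never from the repository:&lt;/p&gt;

```javascript
// Hypothetical production entry for knexfile.js. All connection details
// are placeholders supplied via environment variables (or a secrets
// manager) at deploy time.
module.exports = {
  // ...the development config shown above stays here...
  production: {
    client: 'postgresql',
    connection: {
      host: process.env.PGDB_HOST,
      database: process.env.PGDB_NAME,
      user: process.env.PGDB_USER,
      password: process.env.PGDB_PASSWORD
    },
    pool: { min: 2, max: 10 },
    migrations: {
      tableName: 'knex_migrations',
      directory: `${__dirname}/db/migrations`
    },
    seeds: {
      directory: `${__dirname}/db/seeds`
    }
  }
};
```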

&lt;p&gt;Now, wherever in our app we need to pull data from the database, we import this &lt;code&gt;DB.js&lt;/code&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 5: Populating seed data with knex
&lt;/h3&gt;

&lt;p&gt;So we have our Express app up and running with Knex integrated, and we have our Postgres database created. But we don't have any tables or data in the database yet. In this step, we will use Knex migrations and seed files to create them.&lt;/p&gt;

&lt;p&gt;From the project's root directory, run the following commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npx knex migrate:make initial_setup
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will create a new file in the &lt;code&gt;db/migrations&lt;/code&gt; directory.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npx knex seed:make initial_data
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will create a sample seed file under the &lt;code&gt;db/seeds&lt;/code&gt; directory. First, we need to modify our migration file to create the users table. Open the newly created file under &lt;code&gt;db/migrations&lt;/code&gt; directory and modify it so it looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;exports.up = function (knex) {
  return knex.schema.createTable('users', function (table) {
    table.increments('id');
    table.string('name', 255).notNullable();
  });
};

exports.down = function (knex) {
  return knex.schema.dropTable('users');
};

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, in the &lt;code&gt;up&lt;/code&gt; function, we are creating a &lt;code&gt;users&lt;/code&gt; table with two fields: &lt;code&gt;id&lt;/code&gt; and &lt;code&gt;name&lt;/code&gt;. So, when we apply this migration, a new table will be created. And in the &lt;code&gt;down&lt;/code&gt; function, we are dropping the &lt;code&gt;users&lt;/code&gt; table. So, when we rollback our migration, the &lt;code&gt;users&lt;/code&gt; table will be deleted.&lt;/p&gt;

&lt;p&gt;Also, open the newly created file under &lt;code&gt;db/seeds&lt;/code&gt; directory and modify it so it looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;exports.seed = function (knex) {
  // Deletes ALL existing entries
  return knex('users')
    .del()
    .then(function () {
      // Inserts seed entries
      return knex('users').insert([
        { id: 1, name: 'Alice' },
        { id: 2, name: 'Robert' },
        { id: 3, name: 'Eve' }
      ]);
    });
};

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will first remove any existing entries from our &lt;code&gt;users&lt;/code&gt; table and then populate the same with 3 users.&lt;/p&gt;

&lt;p&gt;Now that we have our migration and seed files ready, we need to apply them. Run the following command to apply the migration:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npx knex migrate:latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And then run the following command to populate the seed data:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npx knex seed:run
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you connect to your Postgres database now, you should be able to see the &lt;code&gt;users&lt;/code&gt; table with 3 entries. Since the users table is ready with data, we need to update the &lt;code&gt;routes/users.js&lt;/code&gt; file to fetch the entries from this table.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 6: Updating the API endpoint created in step 2 to fetch the data from the database instead of returning hard coded data
&lt;/h3&gt;

&lt;p&gt;Open the file &lt;code&gt;routes/users.js&lt;/code&gt; and modify the API endpoint to look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;var express = require('express');
var router = express.Router();
const DB = require('../services/DB');

/* GET users listing. */
router.get('/', async function (req, res, next) {
  const users = await DB('users').select(['id', 'name']);
  return res.json(users);
});

module.exports = router;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, on the 3rd line we are importing the &lt;strong&gt;DB&lt;/strong&gt; service. Then, inside our route handler, we fetch the users using Knex's query builder:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const users = await DB('users').select(['id', 'name']);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Knex does the job of translating this to an SQL query:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;SELECT id, name FROM users;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then we return the users (an array of objects) as the JSON response.&lt;/p&gt;

&lt;p&gt;Now, go to the terminal where you started the application earlier and stop the server. If you remember, in the knexfile we created earlier we used an environment variable &lt;code&gt;PGDB_PASSWORD&lt;/code&gt; to pass the Postgres password to our config. So we need to export this variable with the password of our Postgres server:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export PGDB_PASSWORD=&amp;lt;enter your postgres password here&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then run the Express server again&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;DEBUG=express-postgres-knex-app:* npm start
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now if you go to &lt;a href="http://localhost:3000/users"&gt;http://localhost:3000/users&lt;/a&gt;, you should see the JSON array of user objects fetched from your Postgres database.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;So, in this article we created an ExpressJS app and connected it to a Postgres database using Knex. We also touched upon the benefits of using a robust library like Knex for handling database operations in our application and learned about the concept of migrations. Hope you found this article helpful!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>How to install Elasticsearch 7 with Kibana using Docker Compose</title>
      <dc:creator>Mandeep Singh Gulati</dc:creator>
      <pubDate>Thu, 08 Oct 2020 08:46:55 +0000</pubDate>
      <link>https://dev.to/mandeepm91/how-to-install-elasticsearch-7-with-kibana-using-docker-compose-5d37</link>
      <guid>https://dev.to/mandeepm91/how-to-install-elasticsearch-7-with-kibana-using-docker-compose-5d37</guid>
      <description>&lt;p&gt;This tutorial will help you setup a single node Elasticsearch cluster with Kibana using Docker Compose. &lt;/p&gt;

&lt;h3&gt;
  
  
  Pre-requisites
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Install Docker (refer to official docs if not installed &lt;a href="https://docs.docker.com/engine/install/"&gt;https://docs.docker.com/engine/install/&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Install Docker Compose (refer to official docs if not installed &lt;a href="https://docs.docker.com/compose/install/"&gt;https://docs.docker.com/compose/install/&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This tutorial assumes you are comfortable with Docker and Docker Compose. If you are not, you can go through this article of mine which is kind of a crash course with Docker Compose (&lt;a href="https://medium.com/swlh/simplifying-development-on-your-local-machine-using-docker-and-docker-compose-2b9ef31bdbe7?source=friends_link&amp;amp;sk=240efed3fd3a43a1779e7066edb37235"&gt;https://medium.com/swlh/simplifying-development-on-your-local-machine-using-docker-and-docker-compose-2b9ef31bdbe7?source=friends_link&amp;amp;sk=240efed3fd3a43a1779e7066edb37235&lt;/a&gt;) &lt;/p&gt;

&lt;h3&gt;
  
  
  Video Lesson
&lt;/h3&gt;

&lt;p&gt;I have also created a video tutorial for this on my YouTube channel. If you prefer that, you may visit the link below and check it out&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.youtube.com/watch?v=EClKhOE0p-o"&gt;https://www.youtube.com/watch?v=EClKhOE0p-o&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Create docker-compose.yml file
&lt;/h3&gt;

&lt;p&gt;Create a directory on your machine for this project&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir $HOME/elasticsearch7-docker
cd $HOME/elasticsearch7-docker
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Inside that directory create a &lt;code&gt;docker-compose.yml&lt;/code&gt; file with contents as shown below&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: '3'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.9.2-amd64
    env_file:
      - elasticsearch.env
    volumes:
      - ./data/elasticsearch:/usr/share/elasticsearch/data

  kibana:
    image: docker.elastic.co/kibana/kibana:7.9.2
    env_file:
      - kibana.env
    ports:
      - 5601:5601

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 2: Create the env files
&lt;/h3&gt;

&lt;p&gt;Both the Elasticsearch and Kibana Docker images allow us to pass environment variables, which are mapped onto the configuration defined in the &lt;code&gt;elasticsearch.yml&lt;/code&gt; and &lt;code&gt;kibana.yml&lt;/code&gt; files. To pass the environment variables to a container, we can use the &lt;code&gt;env_file&lt;/code&gt; setting of the Docker Compose file.&lt;/p&gt;

&lt;p&gt;Create the &lt;code&gt;elasticsearch.env&lt;/code&gt; file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cluster.name=my-awesome-elasticsearch-cluster
network.host=0.0.0.0
bootstrap.memory_lock=true
discovery.type=single-node
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Note: With recent versions of Elasticsearch, it is necessary to set the option &lt;code&gt;discovery.type=single-node&lt;/code&gt; for a single-node cluster, otherwise it won't start.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Create the &lt;code&gt;kibana.env&lt;/code&gt; file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;SERVER_HOST="0"
ELASTICSEARCH_HOSTS=http://elasticsearch:9200
XPACK_SECURITY_ENABLED=false
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
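&lt;p&gt;The variable names above follow Kibana's convention for mapping environment variables to &lt;code&gt;kibana.yml&lt;/code&gt; settings: roughly, the setting name is uppercased and its dots become underscores, so &lt;code&gt;server.host&lt;/code&gt; becomes &lt;code&gt;SERVER_HOST&lt;/code&gt;. A tiny sketch of that convention, purely as an illustration (not Kibana's actual code):&lt;/p&gt;

```javascript
// Simplified illustration of Kibana's setting-name-to-environment-variable
// convention: uppercase the name and replace dots with underscores.
function toEnvVar(settingName) {
  return settingName.toUpperCase().replace(/\./g, '_');
}

console.log(toEnvVar('server.host'));            // SERVER_HOST
console.log(toEnvVar('xpack.security.enabled')); // XPACK_SECURITY_ENABLED
```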



&lt;h3&gt;
  
  
  Step 3: Create Elasticsearch data directory
&lt;/h3&gt;

&lt;p&gt;Navigate to the directory where you have created your &lt;code&gt;docker-compose.yml&lt;/code&gt; file and create a subdirectory &lt;code&gt;data&lt;/code&gt;. Then inside the &lt;code&gt;data&lt;/code&gt; directory create another directory &lt;code&gt;elasticsearch&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir data
cd data
mkdir elasticsearch
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We will be mounting this directory to the data directory of &lt;code&gt;elasticsearch&lt;/code&gt; container. In your &lt;code&gt;docker-compose.yml&lt;/code&gt; file there are these lines:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    volumes:
      - ./data/elasticsearch:/usr/share/elasticsearch/data
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This ensures that the data on your Elasticsearch container persists even when the container is stopped and restarted later. So, you won't lose your indices when you restart the containers.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4: Run the setup
&lt;/h3&gt;

&lt;p&gt;We're good to go now. Open your terminal and navigate to the folder containing your &lt;code&gt;docker-compose.yml&lt;/code&gt; file and run the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker-compose up -d
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will start pulling the images from docker.elastic.co and, depending on your internet speed, this may take a while. Once the images are pulled, the containers will be started.&lt;/p&gt;

&lt;p&gt;You can run the following command to see if both the containers are running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker-compose ps
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The output should look something like this&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;                Name                              Command               State           Ports         
------------------------------------------------------------------------------------------------------
docker-elasticsearch-setup_elasticsearch_1   /tini -- /usr/local/bin/do ...   Up      9200/tcp, 9300/tcp    
docker-elasticsearch-setup_kibana_1          /usr/local/bin/dumb-init - ...   Up      0.0.0.0:5601-&amp;gt;5601/tcp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Notice the &lt;code&gt;State&lt;/code&gt; field. It should be &lt;code&gt;Up&lt;/code&gt; for both containers. If it is not, check the logs using the following command (replace {serviceName} with the name of the service, e.g. &lt;code&gt;elasticsearch&lt;/code&gt; or &lt;code&gt;kibana&lt;/code&gt;):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker-compose logs -f {serviceName}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A common error that you might encounter is related to &lt;code&gt;vm.max_map_count&lt;/code&gt; being too low. You can fix it by running the following command (as root, or prefixed with &lt;code&gt;sudo&lt;/code&gt;):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sysctl -w vm.max_map_count=262144
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Check this link for more details &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/vm-max-map-count.html"&gt;https://www.elastic.co/guide/en/elasticsearch/reference/current/vm-max-map-count.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If both services are running fine, you should be able to see the Kibana console at &lt;a href="http://localhost:5601"&gt;http://localhost:5601&lt;/a&gt; in your web browser. Give it a few minutes, as it takes some time for the Elasticsearch cluster to be ready and for Kibana to connect to it. You can get more info by inspecting the logs using the &lt;code&gt;docker-compose logs -f kibana&lt;/code&gt; command. &lt;/p&gt;




&lt;p&gt;This completes our setup. Hope you found it helpful. Happy coding :-)&lt;/p&gt;

</description>
      <category>docker</category>
    </item>
    <item>
      <title>Cloning an object in JavaScript and avoiding Gotchas</title>
      <dc:creator>Mandeep Singh Gulati</dc:creator>
      <pubDate>Tue, 29 Sep 2020 21:13:10 +0000</pubDate>
      <link>https://dev.to/mandeepm91/cloning-an-object-in-javascript-and-avoiding-gotchas-pec</link>
      <guid>https://dev.to/mandeepm91/cloning-an-object-in-javascript-and-avoiding-gotchas-pec</guid>
<description>&lt;p&gt;If you're a JavaScript developer, you must have come across scenarios where you need to clone an object. How do you do it? In this article we will cover various approaches to cloning an object in JavaScript, their shortcomings, and finally the most reliable way to make a deep copy (clone) of an object.&lt;/p&gt;

&lt;p&gt;Let us consider that our object to be cloned is this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const person = {
  name: 'Dolores Abernathy',
  age: 32,
  dob: new Date('1988-09-01')
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;There can be various ways to clone it:&lt;/p&gt;

&lt;p&gt;One way would be to declare a new variable and point it to the original object (which is not exactly cloning the object)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const clone = person
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;What you're doing here is referencing the same object. If you change &lt;code&gt;clone.name&lt;/code&gt;, &lt;code&gt;person.name&lt;/code&gt; will also change. Most of the time, this is not what you intend when you clone an object: you want a copy that shares nothing with the original. Here, &lt;code&gt;clone&lt;/code&gt; is just a reference to the same object referred to by &lt;code&gt;person&lt;/code&gt;. Most JavaScript developers know this, so it is not really a "Gotcha!". But the next two approaches I am going to show are definitely something you need to watch out for.&lt;/p&gt;
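&lt;p&gt;A minimal sketch of this behaviour (hypothetical values, runnable in Node.js or a browser console):&lt;/p&gt;

```javascript
const person = { name: 'Dolores Abernathy', age: 32 }

// "clone" is just a second reference to the same object in memory
const clone = person

clone.name = 'Wyatt'

// Both variables point at the same object, so the original changed too
console.log(person.name) // 'Wyatt'
console.log(person === clone) // true
```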

&lt;p&gt;You'll often see code using the spread operator to clone an object. For example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const clone = { ...person }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Or code using &lt;code&gt;Object.assign&lt;/code&gt; like this&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const clone = Object.assign({}, person)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;One might assume in both of the above cases that &lt;code&gt;clone&lt;/code&gt; is a copy of the original &lt;code&gt;person&lt;/code&gt; object and does not share anything with the original object. This is partially correct but can you guess the output of the code below? (Please take a moment to think what the output should be before copy pasting it)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const person = {
  name: 'Dolores Abernathy',
  age: 32,
  dob: new Date('1988-09-01')
}

const clone = { ...person }

// change the year for person.dob
person.dob.setYear(1986)

// check the clone's dob year
console.log(clone.dob.getFullYear())
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;What was your guess? &lt;code&gt;1988&lt;/code&gt;? &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fBOtNCeo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://codingfundas.com/content/images/2020/09/2020-09-27-02-49-13.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fBOtNCeo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://codingfundas.com/content/images/2020/09/2020-09-27-02-49-13.png" alt="2020-09-27-02-49-13"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The correct answer is &lt;code&gt;1986&lt;/code&gt;. If you guessed the right answer and know the reason behind it, good! You have strong JavaScript fundamentals. If you guessed wrong, that's ok; it is exactly why I am sharing this post. Many of us assume that the spread operator creates a completely separate copy of the object, but this is not true. The same thing happens with &lt;code&gt;Object.assign({}, person)&lt;/code&gt; as well. &lt;/p&gt;

&lt;p&gt;Both these approaches create a shallow copy of the original object. What does that mean? It means that fields holding primitive data types are copied by value, while fields holding objects are copied by reference.&lt;/p&gt;

&lt;p&gt;In our original object, &lt;code&gt;name&lt;/code&gt; and &lt;code&gt;age&lt;/code&gt; are both primitive data types. So, changing &lt;code&gt;person.name&lt;/code&gt; or &lt;code&gt;person.age&lt;/code&gt; does not affect those fields in the &lt;code&gt;clone&lt;/code&gt; object. However, &lt;code&gt;dob&lt;/code&gt; is a &lt;code&gt;Date&lt;/code&gt; field, which is not a primitive data type, so it is copied by reference. When we change anything in the &lt;code&gt;dob&lt;/code&gt; field of the &lt;code&gt;person&lt;/code&gt; object, the change shows up in the &lt;code&gt;clone&lt;/code&gt; object as well.&lt;/p&gt;
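&lt;p&gt;Here is a small demonstration of both behaviours side by side (a sketch using the same example object):&lt;/p&gt;

```javascript
const person = {
  name: 'Dolores Abernathy',
  age: 32,
  dob: new Date('1988-09-01')
}

const clone = { ...person }

// Primitive fields are copied by value: the two objects stay independent
clone.age = 33
console.log(person.age) // 32

// Object fields are copied by reference: both share one Date instance
console.log(person.dob === clone.dob) // true
```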

&lt;h3&gt;
  
  
  How to create a deep copy of an object ?
&lt;/h3&gt;

&lt;p&gt;Now that we know that both the spread operator and the &lt;code&gt;Object.assign&lt;/code&gt; method create shallow copies of an object, how do we create a deep copy? When I say deep copy, I mean that the cloned object should be a completely independent copy of the original object, and changing anything in one of them should not change anything in the other.&lt;/p&gt;

&lt;p&gt;Some people use a combination of &lt;code&gt;JSON.stringify&lt;/code&gt; and &lt;code&gt;JSON.parse&lt;/code&gt; for this. For example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const person = {
  name: 'Dolores Abernathy',
  age: 32,
  dob: new Date('1988-09-01')
}

const clone = JSON.parse(JSON.stringify(person))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;While it's not a bad approach, it has its shortcomings and you need to understand where to avoid using this approach.&lt;/p&gt;

&lt;p&gt;In our example, &lt;code&gt;dob&lt;/code&gt; is a &lt;code&gt;Date&lt;/code&gt; field. When we call &lt;code&gt;JSON.stringify&lt;/code&gt;, it is serialized to a date string. When we then call &lt;code&gt;JSON.parse&lt;/code&gt;, the &lt;code&gt;dob&lt;/code&gt; field remains a string and is not converted back to a &lt;code&gt;Date&lt;/code&gt; object. So, while &lt;code&gt;clone&lt;/code&gt; is a completely independent copy of &lt;code&gt;person&lt;/code&gt; in this case, it is not an exact copy, because the data type of the &lt;code&gt;dob&lt;/code&gt; field differs between the two objects.&lt;/p&gt;

&lt;p&gt;You can try it yourself:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;console.log(person.dob.constructor) // [Function: Date]
console.log(clone.dob.constructor) // [Function: String]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This approach also doesn't work if any field of the original object is a function. For example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const person = {
  name: 'Dolores Abernathy',
  age: 32,
  dob: new Date('1988-09-01'),
  getFirstName: function() {
    console.log(this.name.split(' ')[0])
  }
}

const clone = JSON.parse(JSON.stringify(person))

console.log(Object.keys(person)) // [ 'name', 'age', 'dob', 'getFirstName' ]

console.log(Object.keys(clone)) // [ 'name', 'age', 'dob' ]

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Notice that &lt;code&gt;getFirstName&lt;/code&gt; is missing in the clone object: it was skipped by &lt;code&gt;JSON.stringify&lt;/code&gt; because it is a function.&lt;/p&gt;
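&lt;p&gt;Functions are not the only thing &lt;code&gt;JSON.stringify&lt;/code&gt; silently drops; fields whose value is &lt;code&gt;undefined&lt;/code&gt; disappear as well. A small illustration (hypothetical object):&lt;/p&gt;

```javascript
const source = {
  name: 'Dolores',
  greet: function () { return 'hi' }, // functions are dropped
  nothing: undefined                  // undefined values are dropped too
}

const copy = JSON.parse(JSON.stringify(source))

console.log(Object.keys(copy)) // [ 'name' ]
```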

&lt;h3&gt;
  
  
  What is a reliable way to make a deep copy/clone of an object then?
&lt;/h3&gt;

&lt;p&gt;Up until now, every approach we have discussed has had some shortcoming. Now we will talk about one that doesn't. If you need a truly deep clone of an object in JavaScript, use a third-party library like &lt;code&gt;lodash&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const _ = require('lodash')

const person = {
  name: 'Dolores Abernathy',
  age: 32,
  dob: new Date('1988-09-01'),
  getFirstName: function() {
    console.log(this.name.split(' ')[0])
  }
}

const clone = _.cloneDeep(person)

// change the year for person.dob
person.dob.setYear(1986)

// check clone's dob year
console.log(clone.dob.getFullYear()) // should be 1988

// Check that all fields (including function getFirstName) are copied to new object
console.log(Object.keys(clone)) // [ 'name', 'age', 'dob', 'getFirstName' ]

// check the data type of dob field in clone
console.log(clone.dob.constructor) // [Function: Date]

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can see that the &lt;code&gt;cloneDeep&lt;/code&gt; function of the &lt;code&gt;lodash&lt;/code&gt; library makes a truly deep copy of an object.&lt;/p&gt;
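&lt;p&gt;If you would rather avoid a dependency, modern runtimes (Node.js 17+ and current browsers) also ship a built-in &lt;code&gt;structuredClone&lt;/code&gt; function. It deep-copies &lt;code&gt;Date&lt;/code&gt; objects correctly, though note that it throws if the object contains functions, so it would not work for our &lt;code&gt;getFirstName&lt;/code&gt; example:&lt;/p&gt;

```javascript
const person = {
  name: 'Dolores Abernathy',
  dob: new Date('1988-09-01')
}

// structuredClone makes a deep copy, preserving the Date type
const clone = structuredClone(person)

// mutate the original; the clone is unaffected
person.dob.setFullYear(1986)

console.log(clone.dob.getFullYear()) // 1988
console.log(clone.dob instanceof Date) // true
```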

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;Now that you know the different ways of copying an object in JavaScript and the pros and cons of each approach, I hope this helps you make a more informed decision about which approach fits your use case and avoid any "Gotchas" while writing code.&lt;/p&gt;

&lt;p&gt;Happy Coding :-) &lt;/p&gt;

</description>
      <category>javascript</category>
    </item>
    <item>
      <title>How to add third party scripts &amp; inline scripts in your Nuxt.js app?</title>
      <dc:creator>Mandeep Singh Gulati</dc:creator>
      <pubDate>Mon, 23 Dec 2019 07:21:23 +0000</pubDate>
      <link>https://dev.to/mandeepm91/how-to-add-third-party-scripts-inline-scripts-in-your-nuxt-js-app-1nbf</link>
      <guid>https://dev.to/mandeepm91/how-to-add-third-party-scripts-inline-scripts-in-your-nuxt-js-app-1nbf</guid>
      <description>&lt;h3&gt;
  
  
  Problem statement
&lt;/h3&gt;

&lt;p&gt;Let's say you have created a Nuxt app and one day your client or your boss asks you to add some snippet of code to every page of the site for analytics purposes. For example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;!-- Global site tag (gtag.js) - Google Analytics --&amp;gt;
&amp;lt;script async src="https://www.googletagmanager.com/gtag/js?id=UA-111111111-1"&amp;gt;&amp;lt;/script&amp;gt;
&amp;lt;script&amp;gt;
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());

  gtag('config', 'UA-111111111-1');
&amp;lt;/script&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Solution
&lt;/h3&gt;

&lt;p&gt;Open your &lt;strong&gt;nuxt.config.js&lt;/strong&gt; file and update the &lt;strong&gt;head&lt;/strong&gt; section as follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  head: {
    __dangerouslyDisableSanitizers: ['script'],
    script: [
      {
        hid: 'gtm-script1',
        src: 'https://www.googletagmanager.com/gtag/js?id=UA-111111111-1',
        defer: true
      },
      {
        hid: 'gtm-script2',
        innerHTML: `
          window.dataLayer = window.dataLayer || [];
          function gtag(){dataLayer.push(arguments);}
          gtag('js', new Date());

          gtag('config', 'UA-111111111-1');
        `,
        type: 'text/javascript',
        charset: 'utf-8'
      }
    ]
  },
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As you can see, the &lt;strong&gt;script&lt;/strong&gt; array contains two objects. The first includes the external script from &lt;code&gt;googletagmanager.com&lt;/code&gt;. The second shows how to include inline JavaScript. For the inline script to work, however, you need to add the setting &lt;code&gt;__dangerouslyDisableSanitizers: ['script']&lt;/code&gt;. I am not sure whether this is the best or even the recommended approach, but it worked for me. If you happen to know a better alternative, I would definitely love to hear about it. You can mention it in the comments section below or tag me on Twitter. &lt;/p&gt;

&lt;p&gt;Thanks and happy coding :-)&lt;/p&gt;

</description>
      <category>vue</category>
      <category>javascript</category>
    </item>
    <item>
      <title>How to add authentication to your universal Nuxt app using nuxt/auth module?</title>
      <dc:creator>Mandeep Singh Gulati</dc:creator>
      <pubDate>Mon, 30 Sep 2019 17:05:50 +0000</pubDate>
      <link>https://dev.to/mandeepm91/how-to-add-authentication-to-your-universal-nuxt-app-using-nuxt-auth-module-3ffm</link>
      <guid>https://dev.to/mandeepm91/how-to-add-authentication-to-your-universal-nuxt-app-using-nuxt-auth-module-3ffm</guid>
      <description>&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2Fnuxt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2Fnuxt.png" alt="nuxt"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Recently I was working on a Nuxt.js app and had to add authentication to it. The first thing I thought of was to use vuex to store two fields in the state:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;isLoggedIn&lt;/strong&gt;: a boolean representing whether user is logged in or not&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;loggedInUser&lt;/strong&gt;: an object containing the user details for the session that we get from server&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And then I added a middleware on pages where I wanted to restrict access to logged-in users only. The thought process behind this approach is right, but the problem is that when you refresh the page, the vuex state is lost. To handle that, you would need to use &lt;strong&gt;localStorage&lt;/strong&gt;, but that works only if your app runs in &lt;strong&gt;spa&lt;/strong&gt; mode, that is, on the client side only. If your app runs in &lt;strong&gt;universal&lt;/strong&gt; mode (server-side rendered), you will also need to use &lt;strong&gt;cookies&lt;/strong&gt; and write a custom middleware that checks whether it is running on the client side or the server side and then uses &lt;strong&gt;localStorage&lt;/strong&gt; or &lt;strong&gt;cookies&lt;/strong&gt; accordingly. Doing all of this would be a good exercise to learn how everything works, but adding it to a project where multiple people are working might not be a great idea in my opinion. Nuxt has an officially supported module for exactly this purpose: the &lt;a href="https://auth.nuxtjs.org/" rel="noopener noreferrer"&gt;auth module&lt;/a&gt;. In this post, I will talk about how to integrate the &lt;strong&gt;auth module&lt;/strong&gt; into your Nuxt app to support authentication using &lt;strong&gt;email&lt;/strong&gt; and &lt;strong&gt;password&lt;/strong&gt;. &lt;/p&gt;

&lt;h3&gt;
  
  
  Assumptions for the server API
&lt;/h3&gt;

&lt;p&gt;We are making the assumption that the API server:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Is running on &lt;strong&gt;&lt;a href="http://localhost:8080/v1" rel="noopener noreferrer"&gt;http://localhost:8080/v1&lt;/a&gt;&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Uses cookie based sessions&lt;/li&gt;
&lt;li&gt;Has a JSON based API&lt;/li&gt;
&lt;li&gt;Has the following API endpoints:

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;POST /v1/auth/login&lt;/strong&gt;: accepts &lt;strong&gt;email&lt;/strong&gt; and &lt;strong&gt;password&lt;/strong&gt; in request body and authenticates the user&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;POST /v1/auth/logout&lt;/strong&gt;: does not need request body and deletes the user session from server&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;GET /v1/auth/profile&lt;/strong&gt;: returns the logged in user's object&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  Overview of the steps involved
&lt;/h3&gt;

&lt;p&gt;We will divide this post into following steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Installation of &lt;strong&gt;axios&lt;/strong&gt; and &lt;strong&gt;auth&lt;/strong&gt; modules&lt;/li&gt;
&lt;li&gt;Configuration needed in &lt;strong&gt;nuxt.config.js&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Using the state from auth module to check if user is logged in or not and accessing logged in user in our app components&lt;/li&gt;
&lt;li&gt;Using the auth module to authenticate the user using email and password based authentication&lt;/li&gt;
&lt;li&gt;Using middleware provided by the auth module to restrict access to pages to logged in users only&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step 1: Install the axios and auth modules
&lt;/h3&gt;

&lt;p&gt;Open the terminal, navigate to the root directory of your project and run the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm install @nuxtjs/auth @nuxtjs/axios
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 2: Configure axios and auth modules
&lt;/h3&gt;

&lt;p&gt;Open your &lt;strong&gt;nuxt.config.js&lt;/strong&gt; file, find the &lt;strong&gt;modules&lt;/strong&gt; section, include the &lt;strong&gt;axios&lt;/strong&gt; and &lt;strong&gt;auth&lt;/strong&gt; modules, and add their configuration:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  modules: [
    '@nuxtjs/axios',
    '@nuxtjs/auth'
  ],

  auth: {
    strategies: {
      local: {
        endpoints: {
          login: {
            url: '/auth/login',
            method: 'post',
            propertyName: false
          },
          logout: { 
            url: '/auth/logout', 
            method: 'post' 
          },
          user: { 
            url: '/auth/profile', 
            method: 'get', 
            propertyName: false 
          }
        },
        tokenRequired: false,
        tokenType: false
      }
    }
  },

  axios: {
    baseURL: 'http://localhost:8080/v1',
    credentials: true
  },

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;strong&gt;auth&lt;/strong&gt; object here includes the configuration. The &lt;strong&gt;auth&lt;/strong&gt; module supports &lt;strong&gt;local&lt;/strong&gt; strategy as well as &lt;strong&gt;OAuth2&lt;/strong&gt;. Since we only have email and password based authentication in our case, we only need to provide the configuration for &lt;strong&gt;local&lt;/strong&gt; strategy. &lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;endpoints&lt;/strong&gt; section is where we specify the details about our API server's endpoints for login, logout and logged in user's profile and each of the config looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  user: { 
    url: '/auth/profile', 
    method: 'get', 
    propertyName: false 
  }          
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;url&lt;/strong&gt; and &lt;strong&gt;method&lt;/strong&gt; should be consistent with your server API. The &lt;strong&gt;url&lt;/strong&gt; here needs to be relative to the &lt;strong&gt;baseURL&lt;/strong&gt; config. The &lt;strong&gt;propertyName&lt;/strong&gt; tells the auth module which property in the response object to look for. For example, if your API server's response for &lt;code&gt;GET /auth/profile&lt;/code&gt; is like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "user": {
    "id": 1,
    "name": "Jon Snow",
    "email": "jon.snow@asoiaf.com"
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then you can set the &lt;strong&gt;propertyName&lt;/strong&gt; as &lt;code&gt;user&lt;/code&gt; to look for only the &lt;code&gt;user&lt;/code&gt; key in the API response. If you want to use the entire API response, you need to set &lt;strong&gt;propertyName&lt;/strong&gt; to &lt;code&gt;false&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Since our API server has cookie based sessions, we are setting the &lt;strong&gt;tokenRequired&lt;/strong&gt; and &lt;strong&gt;tokenType&lt;/strong&gt; to &lt;code&gt;false&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;tokenRequired: false,
tokenType: false
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For a complete list of options supported by the auth module, you can visit their official documentation &lt;a href="https://auth.nuxtjs.org/api/options.html#redirect" rel="noopener noreferrer"&gt;here&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;axios&lt;/strong&gt; object in the above config is used to provide the axios configuration. Here, we are setting the following properties:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  axios: {
    baseURL: 'http://localhost:8080/v1',
    credentials: true
  },
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;baseURL&lt;/strong&gt; here is the root URL of our API, and any relative URL that we hit using axios in our app will be resolved against it. Setting &lt;strong&gt;credentials&lt;/strong&gt; to &lt;code&gt;true&lt;/code&gt; ensures that cookies (and other credentials) are sent to the API server with every request. &lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Activate vuex store in your app
&lt;/h3&gt;

&lt;p&gt;In order to use the auth module, we need to activate the &lt;strong&gt;vuex&lt;/strong&gt; store in our application, since that's where the session-related information will be stored. This can be done by adding any &lt;strong&gt;.js&lt;/strong&gt; file inside the &lt;strong&gt;store&lt;/strong&gt; directory of your app; Nuxt will register a namespaced vuex module with the name of the file. Let's go ahead and add a blank file called &lt;strong&gt;index.js&lt;/strong&gt; to the &lt;strong&gt;store&lt;/strong&gt; directory of our app. It's not mandatory to use &lt;strong&gt;index.js&lt;/strong&gt;: any file, for example &lt;strong&gt;xyz.js&lt;/strong&gt;, in the &lt;strong&gt;store&lt;/strong&gt; directory would activate the vuex store in your app.&lt;/p&gt;

&lt;p&gt;The auth module that we have included in our project will automatically register a namespaced module named &lt;strong&gt;auth&lt;/strong&gt; with the vuex store. And it has the following fields in the state:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;loggedIn&lt;/strong&gt;: A boolean denoting if the user is logged in or not&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;user&lt;/strong&gt;: the user object as received from &lt;strong&gt;auth.strategies.local.user&lt;/strong&gt; endpoint configured in our &lt;strong&gt;nuxt.config.js&lt;/strong&gt; file.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;strategy&lt;/strong&gt;: This will be &lt;code&gt;local&lt;/code&gt; in our case&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It also adds the necessary mutations for setting the state. So, even though we haven't created any &lt;strong&gt;auth.js&lt;/strong&gt; file in the &lt;strong&gt;store&lt;/strong&gt; directory of our app, the auth module has automatically taken care of all this. If it helps, imagine that a file named &lt;strong&gt;auth.js&lt;/strong&gt; was automatically created by the auth module in the &lt;strong&gt;store&lt;/strong&gt; directory of your app, even though this file doesn't actually exist. This means that using &lt;strong&gt;mapState&lt;/strong&gt; on the &lt;strong&gt;auth&lt;/strong&gt; module of your vuex store will work. For example, you can use this in any of your components or pages:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  computed: {
    ...mapState('auth', ['loggedIn', 'user'])
  },
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here is a complete example of a component using these properties:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;template&amp;gt;
  &amp;lt;b-navbar type="dark" variant="dark"&amp;gt;
    &amp;lt;b-navbar-brand to="/"&amp;gt;NavBar&amp;lt;/b-navbar-brand&amp;gt;
    &amp;lt;b-navbar-nav class="ml-auto"&amp;gt;
      &amp;lt;b-nav-item v-if="!loggedIn" to="/login"&amp;gt;Login&amp;lt;/b-nav-item&amp;gt;
      &amp;lt;b-nav-item v-if="!loggedIn" to="/register"&amp;gt;Register&amp;lt;/b-nav-item&amp;gt;
      &amp;lt;b-nav-item v-if="loggedIn" @click="logout"&amp;gt;
        &amp;lt;em&amp;gt;Hello {{ user.name }}&amp;lt;/em&amp;gt;
      &amp;lt;/b-nav-item&amp;gt;
      &amp;lt;b-nav-item v-if="loggedIn" @click="logout"&amp;gt;Logout&amp;lt;/b-nav-item&amp;gt;
    &amp;lt;/b-navbar-nav&amp;gt;
  &amp;lt;/b-navbar&amp;gt;
&amp;lt;/template&amp;gt;

&amp;lt;script&amp;gt;
import { mapState } from 'vuex'
export default {
  name: 'NavBar',
  computed: {
    ...mapState('auth', ['loggedIn', 'user'])
  },
  methods: {
    async logout() {
      await this.$auth.logout()
      this.$router.push('/login')
    }
  }
}
&amp;lt;/script&amp;gt;

&amp;lt;style&amp;gt;&amp;lt;/style&amp;gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Alternative approach
&lt;/h4&gt;

&lt;p&gt;Instead of using the &lt;strong&gt;mapState&lt;/strong&gt;, you can also reference the &lt;strong&gt;loggedIn&lt;/strong&gt; and &lt;strong&gt;user&lt;/strong&gt; by &lt;strong&gt;this.$auth.loggedIn&lt;/strong&gt; and  &lt;strong&gt;this.$auth.user&lt;/strong&gt;. So, in the above example, you could have re-written the computed properties as mentioned below and it would have still worked fine:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  computed: {
    loggedIn() {
      return this.$auth.loggedIn
    },
    user() {
      return this.$auth.user
    }
  },
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 4: Authenticating user using the auth module
&lt;/h3&gt;

&lt;p&gt;We know how to use the auth module's APIs to check whether a user is logged in and to access the logged-in user's details. But we haven't yet covered how to actually authenticate the user. This is done using the &lt;strong&gt;this.$auth.loginWith&lt;/strong&gt; method provided by the &lt;strong&gt;auth&lt;/strong&gt; module in any of your components or pages. The first argument to this function is the name of the strategy, which in our case is &lt;code&gt;local&lt;/code&gt;. It's an async function which returns a promise. Here is an example of how to use it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  try {
    await this.$auth.loginWith('local', {
      data: {
        email: 'email@xyz.com',
        password: 'password'
      }
    })
    // do something on success
  } catch (e) {    
    // do something on failure 
  }

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So, typically you would have a login page with a form with &lt;strong&gt;email&lt;/strong&gt; and &lt;strong&gt;password&lt;/strong&gt; fields mapped to &lt;strong&gt;data&lt;/strong&gt; of the component using &lt;strong&gt;v-model&lt;/strong&gt;. And once you submit the form, you can run this function to authenticate using the &lt;strong&gt;auth&lt;/strong&gt; module. Here is an example of login page:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;template&amp;gt;
  &amp;lt;div class="row"&amp;gt;
    &amp;lt;div class="mx-auto col-md-4 mt-5"&amp;gt;
      &amp;lt;b-card&amp;gt;
        &amp;lt;b-form @submit="submitForm"&amp;gt;
          &amp;lt;b-form-group
            id="input-group-1"
            label="Email address:"
            label-for="email"
          &amp;gt;
            &amp;lt;b-form-input
              id="email"
              v-model="email"
              type="email"
              required
              placeholder="Enter email"
            &amp;gt;&amp;lt;/b-form-input&amp;gt;
          &amp;lt;/b-form-group&amp;gt;

          &amp;lt;b-form-group
            id="input-group-2"
            label="Password:"
            label-for="password"
          &amp;gt;
            &amp;lt;b-form-input
              id="password"
              v-model="password"
              type="password"
              required
              placeholder="Enter password"
            &amp;gt;&amp;lt;/b-form-input&amp;gt;
          &amp;lt;/b-form-group&amp;gt;

          &amp;lt;b-button type="submit" variant="primary"&amp;gt;Login&amp;lt;/b-button&amp;gt;
        &amp;lt;/b-form&amp;gt;
      &amp;lt;/b-card&amp;gt;
    &amp;lt;/div&amp;gt;
  &amp;lt;/div&amp;gt;
&amp;lt;/template&amp;gt;

&amp;lt;script&amp;gt;
export default {
  name: 'LoginPage',
  data() {
    return {
      email: '',
      password: ''
    }
  },
  methods: {
    async submitForm(evt) {
      evt.preventDefault()
      const credentials = {
        email: this.email,
        password: this.password
      }
      try {
        await this.$auth.loginWith('local', {
          data: credentials
        })
        this.$router.push('/')
      } catch (e) {
        this.$router.push('/login')
      }
    }
  }
}
&amp;lt;/script&amp;gt;

&amp;lt;style&amp;gt;&amp;lt;/style&amp;gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In order to logout a logged in user, you can use the &lt;strong&gt;this.$auth.logout&lt;/strong&gt; method provided by the &lt;strong&gt;auth&lt;/strong&gt; module. This one doesn't need any arguments. Here is an example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  methods: {
    async logout() {
      await this.$auth.logout()
      this.$router.push('/login')
    }
  }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 5: Using auth middleware to restrict access to certain pages
&lt;/h3&gt;

&lt;p&gt;The &lt;strong&gt;auth&lt;/strong&gt; module also provides middleware to restrict access to logged in users. So, for example if you want to restrict the &lt;strong&gt;/profile&lt;/strong&gt; route of your application to logged in users only, you can add the auth middleware to the &lt;strong&gt;profile.vue&lt;/strong&gt; page like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export default {
  name: 'ProfilePage',
  middleware: ['auth']
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For more details on how you can configure your components and pages to use the &lt;code&gt;auth&lt;/code&gt; middleware, you can check out the official docs &lt;a href="https://auth.nuxtjs.org/guide/middleware.html" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;
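&lt;p&gt;If most pages in your app should be restricted, you can instead register the middleware globally in &lt;strong&gt;nuxt.config.js&lt;/strong&gt; (a sketch based on the auth module's documented options):&lt;/p&gt;

```javascript
// nuxt.config.js -- apply the auth middleware to every route
export default {
  router: {
    middleware: ['auth']
  }
}
```

&lt;p&gt;A public page, such as the login page, would then opt out of the global middleware by setting &lt;code&gt;auth: false&lt;/code&gt; in its component options.&lt;/p&gt;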

&lt;h3&gt;
  
  
  Conclusion and References
&lt;/h3&gt;

&lt;p&gt;This was kind of a getting-started post for the &lt;strong&gt;axios&lt;/strong&gt; and &lt;strong&gt;auth&lt;/strong&gt; modules with &lt;strong&gt;NuxtJS&lt;/strong&gt;. We only covered the local strategy, but the auth module also supports &lt;strong&gt;OAuth2&lt;/strong&gt; and can be used to support login via &lt;strong&gt;Auth0&lt;/strong&gt;, &lt;strong&gt;Facebook&lt;/strong&gt;, &lt;strong&gt;GitHub&lt;/strong&gt; and &lt;strong&gt;Google&lt;/strong&gt;. I would definitely recommend checking out the &lt;strong&gt;Guide&lt;/strong&gt; and &lt;strong&gt;API&lt;/strong&gt; sections of the auth module:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://auth.nuxtjs.org/" rel="noopener noreferrer"&gt;https://auth.nuxtjs.org/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;axios&lt;/strong&gt; module also provides many configuration options. We didn't cover much of it in this post, but I would definitely recommend checking out the official docs for that as well:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://axios.nuxtjs.org/" rel="noopener noreferrer"&gt;https://axios.nuxtjs.org/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I hope this post was helpful in understanding the basics of the auth module in Nuxt and makes it easier for you to navigate the rest of the official documentation on your own.&lt;/p&gt;

&lt;p&gt;Happy coding :-)&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>vue</category>
    </item>
    <item>
      <title>How to SSH to AWS servers using an SSH config file?</title>
      <dc:creator>Mandeep Singh Gulati</dc:creator>
      <pubDate>Wed, 25 Sep 2019 17:20:02 +0000</pubDate>
      <link>https://dev.to/mandeepm91/how-to-ssh-to-aws-servers-using-an-ssh-config-file-3adn</link>
      <guid>https://dev.to/mandeepm91/how-to-ssh-to-aws-servers-using-an-ssh-config-file-3adn</guid>
      <description>&lt;p&gt;How do you usually SSH to an AWS (Amazon Web Services) EC2 instance? If your answer is:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ssh -i &amp;lt;your pem file&amp;gt; &amp;lt;username&amp;gt;@&amp;lt;ip address of server&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then you should read this tutorial.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is an SSH config file and why should I even bother to know?
&lt;/h2&gt;

&lt;p&gt;The above-mentioned method for connecting to an AWS EC2 instance or any remote server is absolutely correct. There is nothing wrong with it and it is a highly secure way of connecting to a remote server. But imagine having to connect to 15 different servers almost every day (15 different IP addresses to remember), each with a different private key file (the pem file in the above example). Let's say on some of the servers you need to connect as user &lt;strong&gt;ubuntu&lt;/strong&gt; and on some as user &lt;strong&gt;ec2-user&lt;/strong&gt;, and you also want port forwarding (more on this later) for some of those connections. Remembering all these configs for even a handful of servers can be a pain and it becomes a mess to handle everything with the above-mentioned method. Do you see the ugliness of it, the disarray? Would it not be much easier if you could just write the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ ssh dev-server
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Or&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ ssh production-server
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Imagine executing this command from any directory, without bothering to remember the location of your pem files (private keys), the username with which you want to connect and the IP address of the server. This would make life so much better. That's exactly what an SSH config file is meant for. As its name suggests, it's a file where you provide all sorts of configuration options like the server IP address, location of the private key file, username, port forwarding, etc. And here you provide an easy to remember name for the servers like &lt;strong&gt;dev-server&lt;/strong&gt; or &lt;strong&gt;production-server&lt;/strong&gt;, etc. &lt;/p&gt;

&lt;p&gt;Now, do you see the beauty of it? The possibilities, the wonder? Well, if you do and you wish to learn how to explore these possibilities, then read on.&lt;/p&gt;

&lt;h2&gt;
  
  
  What will we do in this tutorial?
&lt;/h2&gt;

&lt;p&gt;We will quickly go through a brief introduction of SSH and the concept of private and public keys. Then we will see how to SSH to an AWS instance without using a &lt;strong&gt;config&lt;/strong&gt; file. Then we will learn how to connect to the same instance using an SSH &lt;strong&gt;config&lt;/strong&gt; file instead. So, this brings us to our first question&lt;/p&gt;

&lt;h3&gt;
  
  
  What is SSH?
&lt;/h3&gt;

&lt;p&gt;SSH stands for Secure Shell. The Wikipedia definition says:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Secure Shell (SSH) is a cryptographic network protocol for operating network services securely over an unsecured network&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;In very simple terms, it is a secure way of logging in to a remote server. It gives you a terminal to the remote server where you can execute the shell commands.&lt;/p&gt;

&lt;h3&gt;
  
  
  How does it work?
&lt;/h3&gt;

&lt;p&gt;When you wish to connect to a remote server using SSH, your local machine is the client and the remote server is the server. The client machine runs a process called the &lt;strong&gt;ssh client&lt;/strong&gt;, whose task is to initiate SSH connection requests and participate in establishing the connection with the server. The remote server runs a process called the &lt;strong&gt;ssh server&lt;/strong&gt;, whose task is to listen for SSH connection requests, authenticate them and, on successful authentication, provide access to the remote server's shell. We provide the server IP address, the username with which we wish to log in, and a password or private key to the &lt;strong&gt;ssh client&lt;/strong&gt; when we wish to connect to a remote server. &lt;/p&gt;
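
&lt;p&gt;As a quick sanity check (a sketch; the exact version string will differ on your machine), you can confirm that an ssh client is installed locally:&lt;/p&gt;

```shell
# Print the version of the locally installed SSH client.
# (The remote side runs the server process, typically sshd.)
ssh -V
```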

&lt;h3&gt;
  
  
  What are public and private keys?
&lt;/h3&gt;

&lt;p&gt;Typically, when we connect to a remote server via SSH, we do it using a public-private key based authentication. Public and private keys are basically base64 encoded strings which are stored in files. They are generated in pairs. Think of them as two different keys which are needed together to open a lock. And think of the process of establishing an SSH connection as a process of opening a lock. This process requires two keys of the same pair, a private key and its corresponding public key. We keep our private key file on our local machine and the server needs to store our public key. &lt;/p&gt;
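
&lt;p&gt;To make this concrete, here is a sketch of generating such a key pair yourself with &lt;strong&gt;ssh-keygen&lt;/strong&gt; (AWS generates the pair for you, as we will see below; the file name &lt;strong&gt;demo_key&lt;/strong&gt; is made up for illustration):&lt;/p&gt;

```shell
# A sketch of generating a key pair locally, just to see what the two
# halves look like. The file name demo_key is made up for illustration.
dir="$(mktemp -d)"
ssh-keygen -t rsa -b 2048 -N '' -f "$dir/demo_key" -q
ls "$dir"                        # demo_key (private) and demo_key.pub (public)
cut -c1-20 "$dir/demo_key.pub"   # the public key starts with its type, e.g. ssh-rsa
```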

&lt;p&gt;Let us say we wish to login to a hypothetical remote server with IP address &lt;strong&gt;54.0.0.121&lt;/strong&gt; with username &lt;strong&gt;john&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;We give our &lt;strong&gt;ssh client&lt;/strong&gt; the address of the server (&lt;strong&gt;54.0.0.121&lt;/strong&gt;), the username with which we wish to log in (&lt;strong&gt;john&lt;/strong&gt;) and the private key file to use. The &lt;strong&gt;ssh client&lt;/strong&gt; goes to the &lt;strong&gt;ssh server&lt;/strong&gt; at the address we gave it and asks the &lt;strong&gt;ssh server&lt;/strong&gt; to bring the public key for user &lt;strong&gt;john&lt;/strong&gt; to open the lock (that is, to authenticate the user &lt;strong&gt;john&lt;/strong&gt; and provide him access to the remote server via SSH). &lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;ssh server&lt;/strong&gt; checks the list of public keys it has and brings the public key for &lt;strong&gt;john&lt;/strong&gt;. Both &lt;strong&gt;ssh client&lt;/strong&gt; and &lt;strong&gt;ssh server&lt;/strong&gt; then insert their respective private and public keys into the common lock. If the keys belong to the same pair, the lock opens and the connection is established. If the &lt;strong&gt;ssh server&lt;/strong&gt; does not have the public key for user &lt;strong&gt;john&lt;/strong&gt;, the lock does not open and authentication fails. &lt;/p&gt;

&lt;p&gt;The above analogy is an oversimplification. The actual process is somewhat more complex. If you wish to understand the details of how it actually works, I would recommend &lt;a href="https://www.digitalocean.com/community/tutorials/understanding-the-ssh-encryption-and-connection-process"&gt;this article on DigitalOcean&lt;/a&gt;. &lt;/p&gt;

&lt;h3&gt;
  
  
  How does one usually SSH to an AWS EC2 instance?
&lt;/h3&gt;

&lt;p&gt;When we create a new AWS EC2 instance, for example using an Amazon Linux AMI or an Ubuntu Server AMI, at the last step we are asked about creating a new key pair or choosing an existing one. If you are doing this for the first time, you will need to create a new key pair and provide a name for it at this step. Let's say you provide the name &lt;strong&gt;MyKeyPair&lt;/strong&gt;.&lt;br&gt;
Before being able to proceed, you need to click the button &lt;strong&gt;Download Key Pair&lt;/strong&gt;. This generates a public/private key pair and lets you download the private key as the &lt;strong&gt;MyKeyPair.pem&lt;/strong&gt; file. When you then click the &lt;strong&gt;Launch Instance&lt;/strong&gt; button, AWS automatically adds the public key of the pair to the newly created EC2 instance. Public keys live in the &lt;strong&gt;~/.ssh/authorized_keys&lt;/strong&gt; file. So, if you chose the &lt;strong&gt;Amazon Linux AMI&lt;/strong&gt; while creating the EC2 instance, it will be added to the &lt;strong&gt;/home/ec2-user/.ssh/authorized_keys&lt;/strong&gt; file. Similarly, if you chose the &lt;strong&gt;Ubuntu Linux AMI&lt;/strong&gt;, the public key will be added to the &lt;strong&gt;/home/ubuntu/.ssh/authorized_keys&lt;/strong&gt; file. The first thing you need to do is change the permissions of your private key file (MyKeyPair.pem). Navigate to the directory where your private key file is located and run the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ chmod 400 MyKeyPair.pem
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This makes your private key file readable only by you: nobody, including you, can write to it, and nobody else can read it. SSH will refuse to use a private key whose permissions are too open. Now, in order to SSH to an EC2 instance, we would execute the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ ssh -i &amp;lt;path to MyKeyPair.pem&amp;gt; &amp;lt;username&amp;gt;@&amp;lt;ip address of the server&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So, for example if the IP address of the server was say &lt;strong&gt;54.0.0.121&lt;/strong&gt; and we chose Ubuntu Linux while creating the EC2 instance, then the username will be &lt;strong&gt;ubuntu&lt;/strong&gt;. And our command becomes&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ ssh -i MyKeyPair.pem ubuntu@54.0.0.121
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is assuming we are running this command from the directory containing our &lt;strong&gt;MyKeyPair.pem&lt;/strong&gt; file. If we are executing this command from some other directory then we will need to provide the correct path of the &lt;strong&gt;MyKeyPair.pem&lt;/strong&gt; file. Similarly, if we used Amazon Linux AMI while creating the EC2 instance, then username in that case becomes &lt;strong&gt;ec2-user&lt;/strong&gt;. &lt;/p&gt;
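
&lt;p&gt;Incidentally, if you ever need the public half of the pair (say, to add it to another server's &lt;strong&gt;authorized_keys&lt;/strong&gt; file), you can derive it from the downloaded private key with &lt;code&gt;ssh-keygen -y&lt;/code&gt;. A sketch, where the AWS-downloaded &lt;strong&gt;MyKeyPair.pem&lt;/strong&gt; is simulated with a freshly generated key:&lt;/p&gt;

```shell
# Derive the public key from a private key file. MyKeyPair.pem is
# simulated here with a freshly generated key so the sketch is runnable.
dir="$(mktemp -d)"
ssh-keygen -t rsa -N '' -f "$dir/MyKeyPair.pem" -q   # stand-in for the AWS download
rm "$dir/MyKeyPair.pem.pub"                          # pretend we only have the .pem
chmod 400 "$dir/MyKeyPair.pem"
ssh-keygen -y -f "$dir/MyKeyPair.pem"                # prints the matching public key
```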

&lt;p&gt;So, this explains how AWS generates public-private key pairs when you create an EC2 instance and how you can use the private key to connect to an EC2 instance. Next we will learn how to do the same using an SSH &lt;strong&gt;config&lt;/strong&gt; file. &lt;/p&gt;

&lt;h3&gt;
  
  
  How to use an SSH config file
&lt;/h3&gt;

&lt;p&gt;We have already discussed what an SSH config file is. Now we will create one and use it to connect to the EC2 instance we connected to earlier. The SSH config file needs to be in the &lt;strong&gt;~/.ssh&lt;/strong&gt; directory of the client machine, which in our case is the local machine. So, go to the &lt;strong&gt;~/.ssh&lt;/strong&gt; directory (create it if it does not exist) and create a file named &lt;strong&gt;config&lt;/strong&gt;. Open the file and add the following contents to it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Host &amp;lt;an easy to remember name for the server&amp;gt;
  HostName &amp;lt;IP address of the server&amp;gt;
  IdentityFile &amp;lt;full path of the private Key file&amp;gt;
  User &amp;lt;username&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace the values in &lt;code&gt;&amp;lt;&amp;gt;&lt;/code&gt; with actual values in your case. For example, if we used Ubuntu Linux AMI and the IP address of the server is &lt;strong&gt;54.0.0.121&lt;/strong&gt; and the private key file (MyKeyPair.pem file) is located in &lt;strong&gt;/home/mandeep/private_keys&lt;/strong&gt; directory, then the content of the &lt;strong&gt;config&lt;/strong&gt; file becomes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Host my-server
  HostName 54.0.0.121
  IdentityFile /home/mandeep/private_keys/MyKeyPair.pem
  User ubuntu
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let us see what each of these lines mean:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Host&lt;/strong&gt;: An easy-to-remember name for the server. This is only for your reference&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;HostName&lt;/strong&gt;: The fully qualified domain name or IP address of the server. In our example we have used an IP address, but it can also be a fully qualified domain name like &lt;strong&gt;api.example.com&lt;/strong&gt;. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;IdentityFile&lt;/strong&gt;: Absolute path of the private key file&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;User&lt;/strong&gt;: Username of the user logging in. This user must exist on the server and have the public key in the &lt;strong&gt;~/.ssh/authorized_keys&lt;/strong&gt; file.&lt;/li&gt;
&lt;/ul&gt;
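
&lt;p&gt;If you prefer doing this from the terminal, here is a sketch of creating such a config file with safe permissions (written to a temporary directory for illustration; in practice the file lives at &lt;strong&gt;~/.ssh/config&lt;/strong&gt;):&lt;/p&gt;

```shell
# Sketch: create an SSH config file with safe permissions. We write to a
# temporary directory so the demo has no side effects; in practice the
# file lives at ~/.ssh/config.
dir="$(mktemp -d)"
printf '%s\n' \
  'Host my-server' \
  '  HostName 54.0.0.121' \
  '  IdentityFile /home/mandeep/private_keys/MyKeyPair.pem' \
  '  User ubuntu' > "$dir/config"
chmod 600 "$dir/config"   # ssh refuses to use a config that others can write to
cat "$dir/config"
```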

&lt;p&gt;Once you save this file, you can easily connect to your EC2 instance by running the following command in the terminal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ ssh my-server
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, it does not matter from which directory you execute this command. You can add as many configurations as you want in your config file. For example, if you wish to connect to another server with IP address say &lt;strong&gt;54.1.1.91&lt;/strong&gt; and private key as &lt;strong&gt;MySecondKey.pem&lt;/strong&gt; and username as &lt;strong&gt;ec2-user&lt;/strong&gt; then your &lt;strong&gt;config&lt;/strong&gt; file should look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Host my-server
  HostName 54.0.0.121
  IdentityFile /home/mandeep/private_keys/MyKeyPair.pem
  User ubuntu
Host my-second-server
  HostName 54.1.1.91
  IdentityFile /home/mandeep/private_keys/MySecondKey.pem
  User ec2-user
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now you can connect to the &lt;strong&gt;my-second-server&lt;/strong&gt; by running the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ ssh my-second-server
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's it. That's how you create an SSH &lt;strong&gt;config&lt;/strong&gt; file. Easy, isn't it? And once you start using it, it's hard to imagine living without it. It makes life so much better. &lt;/p&gt;
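
&lt;p&gt;One handy trick: &lt;code&gt;ssh -G&lt;/code&gt; prints the options ssh would resolve for a host alias without actually connecting, which makes it easy to verify a config entry. A sketch using a throwaway config file built from the example values in this post:&lt;/p&gt;

```shell
# ssh -G resolves and prints the options for a host alias without
# connecting. Here we point it at a throwaway config file (-F) built
# from the example values used in this post.
dir="$(mktemp -d)"
printf '%s\n' \
  'Host my-server' \
  '  HostName 54.0.0.121' \
  '  IdentityFile /home/mandeep/private_keys/MyKeyPair.pem' \
  '  User ubuntu' > "$dir/config"
ssh -G -F "$dir/config" my-server | grep -E '^(hostname|user) '
```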

&lt;p&gt;So, we know how to SSH to a remote server using config files. What next?&lt;/p&gt;

&lt;p&gt;Well, there are plenty of configuration options one can provide in a &lt;strong&gt;config&lt;/strong&gt; file and discussing all of them is beyond the scope of this tutorial. You can refer to &lt;a href="https://www.ssh.com/ssh/config/"&gt;the documentation here&lt;/a&gt; for the complete list of options, but I will be discussing the two options that I usually find quite handy:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;LocalForward&lt;/li&gt;
&lt;li&gt;ForwardAgent&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  LocalForward or Local Port Forwarding
&lt;/h3&gt;

&lt;p&gt;Let us discuss this with an example. Consider a scenario where you have a remote server with the domain name &lt;strong&gt;redis.mydomain.com&lt;/strong&gt;, and on this server you are running some process that is not accessible publicly. For example, say a redis server is running on port &lt;strong&gt;6379&lt;/strong&gt;, but it can only be accessed after you log in to the remote server, not from outside. Now let's say we need to access this remote redis server from a script running on our local machine. How do we do this? &lt;/p&gt;

&lt;p&gt;SSH tunneling allows us to map a port on our local machine to an &lt;strong&gt;ip address:port&lt;/strong&gt; on the remote server. For example, we can map port &lt;strong&gt;6389&lt;/strong&gt; on our local machine to the address &lt;strong&gt;localhost:6379&lt;/strong&gt; on the remote server. After doing this, our local machine thinks that the redis server (which is actually running on the remote server on &lt;strong&gt;localhost:6379&lt;/strong&gt;) is running on our local machine on port &lt;strong&gt;6389&lt;/strong&gt;. So, when you hit &lt;strong&gt;localhost:6389&lt;/strong&gt; on your local machine, you are actually hitting the redis server running on the remote server on port &lt;strong&gt;6379&lt;/strong&gt;. &lt;/p&gt;
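
&lt;p&gt;For a one-off session you don't even need a config file: the &lt;code&gt;-L&lt;/code&gt; flag sets up the same mapping from the command line, and &lt;code&gt;ssh -G&lt;/code&gt; lets you preview the forwarding that would be applied without actually connecting (the hostname below is a placeholder):&lt;/p&gt;

```shell
# The same mapping as a one-off command: -L forwards local port 6389 to
# localhost:6379 as seen from the remote host. ssh -G only previews the
# resolved options, so nothing actually connects here; redis.mydomain.com
# is a placeholder hostname.
ssh -G -L 6389:localhost:6379 redis.mydomain.com | grep -i localforward
```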

&lt;p&gt;&lt;strong&gt;How do we do this using our SSH config file?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We just need to add an additional property of &lt;strong&gt;LocalForward&lt;/strong&gt;. Here is an example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Host Redis-Server
  HostName redis.mydomain.com
  IdentityFile /home/mandeep/private_keys/RedisServerKey.pem
  LocalForward 6389 localhost:6379
  User ubuntu
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This approach comes in quite handy when you want to access a server that is part of a VPC (Virtual Private Cloud) and not accessible publicly, for example an ElastiCache instance or an RDS instance. &lt;/p&gt;

&lt;h3&gt;
  
  
  ForwardAgent
&lt;/h3&gt;

&lt;p&gt;This property lets an SSH session on a remote server use the SSH credentials of your local machine. Consider a scenario where you have a private Git repository on GitHub. You can access the repository either via &lt;strong&gt;HTTPS&lt;/strong&gt; using username and password, or via &lt;strong&gt;SSH&lt;/strong&gt; using a private key. The username and password approach is less secure and not recommended. For accessing your repo via SSH, we typically create a private/public key pair, stored in the &lt;strong&gt;~/.ssh&lt;/strong&gt; directory as the &lt;strong&gt;id_rsa&lt;/strong&gt; (private key) and &lt;strong&gt;id_rsa.pub&lt;/strong&gt; (public key) files. Once we add our public key (&lt;strong&gt;id_rsa.pub&lt;/strong&gt;) to our GitHub account, we can access our repository via SSH. This works well from our local machine.&lt;/p&gt;

&lt;p&gt;Now consider a scenario where you need to SSH to a remote server and access the Git repository from that remote server. You have two options here. One is to copy your private key (&lt;strong&gt;id_rsa&lt;/strong&gt;) file into the &lt;strong&gt;~/.ssh&lt;/strong&gt; directory on the remote server. This is a bad approach, since you are never supposed to share your private key file. Another approach is to generate a new key pair on the server and add the public key of that pair to the GitHub repo. There is a problem with both approaches: anyone who can SSH to the remote server will be able to access the Git repository.&lt;/p&gt;

&lt;p&gt;Let's say we don't want that. We only want developers who have access to the repo through their own private key files to be able to access it; anybody else who can SSH to the remote server but does not have access to the repo should not be able to access it from there. This is where the &lt;strong&gt;ForwardAgent&lt;/strong&gt; property comes in quite handy. You can add it to your &lt;strong&gt;config&lt;/strong&gt; file as shown below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Host App-Server
  Hostname app.mydomain.com
  IdentityFile /home/mandeep/private_keys/AppServerKey.pem
  User ubuntu
  ForwardAgent yes   
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After adding this property to your &lt;strong&gt;config&lt;/strong&gt; file, when you SSH to the server using the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ ssh App-Server
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then the SSH session that gets opened forwards authentication requests back to the &lt;strong&gt;ssh-agent&lt;/strong&gt; running on your local machine; your private key (&lt;strong&gt;id_rsa&lt;/strong&gt;) never actually leaves your machine. Now, even if there is no &lt;strong&gt;~/.ssh/id_rsa&lt;/strong&gt; file on the remote server, any Git repository that you can access on your local machine can also be accessed from the remote server.&lt;/p&gt;
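
&lt;p&gt;Note that agent forwarding only helps if your key is actually loaded into the local &lt;strong&gt;ssh-agent&lt;/strong&gt;. Here is a sketch of checking that, with a throwaway key standing in for &lt;strong&gt;~/.ssh/id_rsa&lt;/strong&gt;:&lt;/p&gt;

```shell
# Agent forwarding relays authentication requests to the ssh-agent on
# your local machine, so the key must be loaded there. A throwaway key
# stands in for ~/.ssh/id_rsa in this sketch.
eval "$(ssh-agent -s)" >/dev/null   # start an agent for this shell
dir="$(mktemp -d)"
ssh-keygen -t ed25519 -N '' -f "$dir/id_demo" -q
ssh-add "$dir/id_demo" 2>/dev/null
ssh-add -l                          # lists the keys the agent can forward
```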

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;With this tutorial we learned the importance of an SSH config file and saw how it can make our lives easier. If you found this tutorial helpful and believe that it can help others, please share it on social media using the sharing buttons below. If you like my tutorials and my writing style, follow me on Twitter. If you feel I have made any mistakes or any information in this article is incorrect, feel free to mention it in the comments below. Thanks! Happy coding :-)&lt;/p&gt;

</description>
      <category>aws</category>
      <category>ssh</category>
    </item>
    <item>
      <title>Setting up Elasticsearch and Kibana on Docker with X-Pack security enabled</title>
      <dc:creator>Mandeep Singh Gulati</dc:creator>
      <pubDate>Wed, 25 Sep 2019 17:14:46 +0000</pubDate>
      <link>https://dev.to/mandeepm91/setting-up-elasticsearch-and-kibana-on-docker-with-x-pack-security-enabled-48dm</link>
      <guid>https://dev.to/mandeepm91/setting-up-elasticsearch-and-kibana-on-docker-with-x-pack-security-enabled-48dm</guid>
      <description>&lt;p&gt;This tutorial assumes that you are familiar with Elasticsearch and Kibana and have some understanding of Docker. Before diving into the objective of this article, I would like to provide a brief introduction about X-Pack and go over some of the latest changes in Elasticsearch version 6.8 which allow us to use the security features of X-Pack for free with the basic license. &lt;/p&gt;

&lt;h2&gt;
  
  
  X-Pack Security and Elasticsearch 6.8
&lt;/h2&gt;

&lt;p&gt;X-Pack is a set of features that extend the Elastic Stack, that is Elasticsearch, Kibana, Logstash and Beats. This includes features like security, monitoring, machine learning, reporting, etc. In this article, we are mainly concerned with the security features of X-Pack. &lt;/p&gt;

&lt;p&gt;X-Pack security makes securing your Elasticsearch cluster very easy and highly customizable. It allows you to set up authentication for your Elasticsearch cluster and create different users with different credentials and different levels of access. It also allows you to create different roles and assign similar users to the same role. For example, if you want to grant certain users read-only access to certain indices of your cluster while ensuring they cannot write to those indices, you can easily achieve that with X-Pack security. And this is just the tip of the iceberg. You can check out the &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api.html" rel="noopener noreferrer"&gt;security API here&lt;/a&gt; for a more detailed view of everything you can do with it. &lt;/p&gt;

&lt;p&gt;But all these features were not always available for free. Prior to version 6.8, security was not a part of the Basic license. I'll quickly explain what this means. The Elastic Stack has 4 different types of licenses that you can see &lt;a href="https://www.elastic.co/subscriptions" rel="noopener noreferrer"&gt;here&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open Source&lt;/li&gt;
&lt;li&gt;Basic&lt;/li&gt;
&lt;li&gt;Gold&lt;/li&gt;
&lt;li&gt;Platinum&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Gold and Platinum are paid licenses whereas Open Source and Basic are free. If you visit the above-mentioned link, you can see which features are available under which license. If you open the security dropdown on that page, you can see that some of the security features are available as part of the Basic license. As of writing this article, the following security features are available under the Basic license:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F06%2FScreenshot-from-2019-06-09-12-00-52.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F06%2FScreenshot-from-2019-06-09-12-00-52.png" alt="Screenshot-from-2019-06-09-12-00-52"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now this list reflects the latest version of the Elastic Stack which, at the time of writing this article, is version 7.1. &lt;strong&gt;Security was made available under the Basic license from version 6.8 onwards&lt;/strong&gt;. This is important because it means that if you want to use the security features in your Elasticsearch setup for free, you need version 6.8 or later. &lt;/p&gt;

&lt;p&gt;And that's what we will be using for this article.&lt;/p&gt;

&lt;h2&gt;
  
  
  Objective
&lt;/h2&gt;

&lt;p&gt;The objective of this article is to set up Elasticsearch and Kibana using Docker Compose with security features enabled. We will be setting up basic authentication on Elasticsearch, so all API calls will need to include authentication credentials. Also, the Kibana UI will require a username and password to log in. Using Docker Compose makes our entire setup very easy to deploy anywhere and to scale. I'll be using Ubuntu 18.04 for this tutorial, but the steps will remain more or less the same on other Unix-based systems and should not be too different on a Windows-based system either. The only piece of code that we will be writing in this article is a &lt;code&gt;docker-compose.yml&lt;/code&gt; file. We will start with a minimal &lt;code&gt;docker-compose.yml&lt;/code&gt; file to get the Elasticsearch and Kibana setup up and running, and then gradually tweak it to enable the security features.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Linux based OS&lt;/li&gt;
&lt;li&gt;Docker and Docker Compose installed&lt;/li&gt;
&lt;li&gt;Basic understanding of Docker and Docker Compose&lt;/li&gt;
&lt;li&gt;Knowledge of Elasticsearch and Kibana&lt;/li&gt;
&lt;li&gt;Experience with Linux command line&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you are using some other operating system, you can follow the instructions specific to that OS but the process remains more or less the same. &lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1 - Create a basic docker-compose.yml file for Elasticsearch and Kibana
&lt;/h3&gt;

&lt;p&gt;In this step we will create our &lt;code&gt;docker-compose.yml&lt;/code&gt; file with two services, &lt;code&gt;elasticsearch&lt;/code&gt; and &lt;code&gt;kibana&lt;/code&gt; and map their respective ports to the host OS&lt;/p&gt;

&lt;p&gt;Let us first start with creating a directory for our project. Open your terminal and type the following&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ cd
$ mkdir elasticsearch-kibana-setup
$ cd elasticsearch-kibana-setup
$ touch docker-compose.yml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then open the newly created &lt;code&gt;docker-compose.yml&lt;/code&gt; file and paste the following lines in it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: '3'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.8.0
    ports:
      - 9200:9200

  kibana:
    depends_on:
      - elasticsearch  
    image: docker.elastic.co/kibana/kibana:6.8.0
    ports:
      - 5601:5601
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The official docker images for Elastic Stack can be found &lt;a href="https://www.docker.elastic.co/#" rel="noopener noreferrer"&gt;here &lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As discussed in the beginning of this article, we will be using version 6.8 for this setup. If you visit the above link and click on Elasticsearch image 6.8 to expand, you'll see two images:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker pull docker.elastic.co/elasticsearch/elasticsearch:6.8.0   
docker pull docker.elastic.co/elasticsearch/elasticsearch-oss:6.8.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F06%2FScreenshot-from-2019-06-09-15-29-52.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F06%2FScreenshot-from-2019-06-09-15-29-52.png" alt="Screenshot-from-2019-06-09-15-29-52"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can see that one of them has the &lt;code&gt;oss&lt;/code&gt; tag while the other does not. The difference between these two images is the license. The oss one comes with the open source license whereas the non-oss one comes with the Basic license. Since the X-Pack security features are only available with the Basic license, we will be using the non-oss version. Please note that this is also free, as explained in the beginning of the article.&lt;/p&gt;

&lt;p&gt;Apart from specifying the images, we are mapping the ports of the containers to the ports on the host machine. Elasticsearch runs on port 9200 and Kibana on port 5601 so we are mapping both these ports to the corresponding ports on the host machine. You can map them to some other port as well. The syntax remains the same:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;Host Port&amp;gt;:&amp;lt;Container Port&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So, for instance, if you want to access elasticsearch on port 8080 of your host machine, you'll need to specify the config as:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;8080:9200
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For now we'll be mapping it to 9200 in this article. Also, the &lt;code&gt;depends_on&lt;/code&gt; setting in the &lt;code&gt;kibana&lt;/code&gt; service ensures that the kibana container is not started before the elasticsearch container. Note that this only controls start-up order: Compose does not wait for Elasticsearch to actually be ready to accept connections. So, let's try to start our setup with the above settings by running the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker-compose up
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will start pulling the images from the Docker registry and create the containers. This may take a while, depending on whether you already have the images on your machine and on your internet speed. After the images have been pulled, you'll start seeing container logs, which will take a few more seconds. Once both Elasticsearch and Kibana are ready, you'll see something like this in your console:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;elasticsearch_1  | [2019-06-09T10:14:21,167][INFO ][o.e.c.r.a.AllocationService] [pKPbPLz] Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[.kibana_1][0]] ...]).
kibana_1         | {"type":"log","@timestamp":"2019-06-09T10:14:21Z","tags":["info","migrations"],"pid":1,"message":"Pointing alias .kibana to .kibana_1."}
kibana_1         | {"type":"log","@timestamp":"2019-06-09T10:14:21Z","tags":["info","migrations"],"pid":1,"message":"Finished in 175ms."}
kibana_1         | {"type":"log","@timestamp":"2019-06-09T10:14:21Z","tags":["listening","info"],"pid":1,"message":"Server running at http://0:5601"}
elasticsearch_1  | [2019-06-09T10:14:21,282][INFO ][o.e.c.m.MetaDataIndexTemplateService] [pKPbPLz] adding template [kibana_index_template:.kibana] for index patterns [.kibana]
elasticsearch_1  | [2019-06-09T10:14:21,326][INFO ][o.e.c.m.MetaDataIndexTemplateService] [pKPbPLz] adding template [kibana_index_template:.kibana] for index patterns [.kibana]
elasticsearch_1  | [2019-06-09T10:14:21,343][INFO ][o.e.c.m.MetaDataIndexTemplateService] [pKPbPLz] adding template [kibana_index_template:.kibana] for index patterns [.kibana]
kibana_1         | {"type":"log","@timestamp":"2019-06-09T10:14:22Z","tags":["status","plugin:spaces@6.8.0","info"],"pid":1,"state":"green","message":"Status changed from yellow to green - Ready","prevState":"yellow","prevMsg":"Waiting for Elasticsearch"}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Look for the lines where it says the status has changed to &lt;code&gt;green&lt;/code&gt;. This means that our setup is ready. If you instead see error messages, something went wrong and you'll need to debug and resolve the issue.&lt;/p&gt;

&lt;p&gt;Once the services are up and running, open the URL &lt;a href="http://localhost:9200/" rel="noopener noreferrer"&gt;http://localhost:9200/&lt;/a&gt; in your browser and you will see something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "name" : "pKPbPLz",
  "cluster_name" : "docker-cluster",
  "cluster_uuid" : "AjqbFZ0qRF-X0_TQZqWIZA",
  "version" : {
    "number" : "6.8.0",
    "build_flavor" : "default",
    "build_type" : "docker",
    "build_hash" : "65b6179",
    "build_date" : "2019-05-15T20:06:13.172855Z",
    "build_snapshot" : false,
    "lucene_version" : "7.7.0",
    "minimum_wire_compatibility_version" : "5.6.0",
    "minimum_index_compatibility_version" : "5.0.0"
  },
  "tagline" : "You Know, for Search"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's Elasticsearch. Also, if you navigate to &lt;a href="http://localhost:5601/" rel="noopener noreferrer"&gt;http://localhost:5601/&lt;/a&gt; you should see the Kibana console. So, with just 13 lines of code in your &lt;code&gt;docker-compose.yml&lt;/code&gt; file, you have set up a single node cluster of Elasticsearch and Kibana. &lt;/p&gt;

&lt;p&gt;Now although this works, there is a security challenge if you want to deploy it to production. If your server is not part of a VPC (Virtual Private Cloud) and the ports 9200 and 5601 are accessible to the world, your Elasticsearch and Kibana services can be accessed by anyone. There is no authorization, so anyone can make changes to your cluster using the Elasticsearch API directly or through the Kibana UI. What if we wanted to keep those ports accessible but require some sort of authentication, so that only those who have the right credentials can access our Elasticsearch instance or log in to the Kibana UI? Also, what if we want to ensure that certain users have only a limited set of privileges? For example, we might want certain users to be able to search any index in our Elasticsearch cluster, but not be able to create or drop an index, change a mapping, or write to an index. Or let's say you don't want your Elasticsearch instance directly accessible to the rest of the world, but want to keep the Kibana UI accessible behind authentication, with different users of the Kibana UI having different access levels. All of this can be achieved with X-Pack security, and that's what we will be exploring next. &lt;/p&gt;

&lt;p&gt;Go back to the terminal window where you ran the &lt;code&gt;docker-compose up&lt;/code&gt; command and press &lt;code&gt;CTRL+C&lt;/code&gt; to stop the containers and tear down the setup.&lt;/p&gt;
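&lt;p&gt;As an aside, since &lt;code&gt;depends_on&lt;/code&gt; only controls start order, you may sometimes want Compose to wait until Elasticsearch is actually healthy before starting Kibana. A sketch of what that could look like is below; note that the &lt;code&gt;condition: service_healthy&lt;/code&gt; form requires a newer Compose version than the plain file shown in this article, so treat this as an optional variation rather than part of our setup:&lt;/p&gt;

```yaml
# Sketch only: make kibana wait for a healthy elasticsearch.
# Requires a Compose version that supports healthcheck conditions.
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.8.0
    healthcheck:
      # Succeeds once Elasticsearch answers on its HTTP port.
      test: ["CMD-SHELL", "curl -sf http://localhost:9200 || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 12

  kibana:
    image: docker.elastic.co/kibana/kibana:6.8.0
    depends_on:
      elasticsearch:
        condition: service_healthy
```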

&lt;h3&gt;
  
  
  Step 2 - Customize Elasticsearch and Kibana services with environment variables
&lt;/h3&gt;

&lt;p&gt;In order to enable X-Pack security, we will need to customize our elasticsearch and kibana services. Elasticsearch settings can be customized via the &lt;code&gt;elasticsearch.yml&lt;/code&gt; file and Kibana settings via the &lt;code&gt;kibana.yml&lt;/code&gt; file. There are several ways to change these while using Docker. We could pass environment variables via our &lt;code&gt;docker-compose.yml&lt;/code&gt; file. Although this would normally be an ideal way, the way Elasticsearch and Kibana environment variables are passed is not the same, which can cause problems in certain deployment environments. You can read more about it &lt;a href="https://github.com/elastic/elasticsearch-docker/issues/135#issuecomment-346227008" rel="noopener noreferrer"&gt;here&lt;/a&gt;. For this tutorial, we will be creating custom &lt;code&gt;elasticsearch.yml&lt;/code&gt; and &lt;code&gt;kibana.yml&lt;/code&gt; files and bind mounting them to their respective containers, overriding the default files in those containers. &lt;/p&gt;

&lt;p&gt;This will become clearer in the next steps. First, create two files, &lt;code&gt;elasticsearch.yml&lt;/code&gt; and &lt;code&gt;kibana.yml&lt;/code&gt;, in the same directory as our &lt;code&gt;docker-compose.yml&lt;/code&gt; file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ touch elasticsearch.yml
$ touch kibana.yml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then open &lt;code&gt;elasticsearch.yml&lt;/code&gt; and paste the following lines in it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cluster.name: my-elasticsearch-cluster
network.host: 0.0.0.0
xpack.security.enabled: true
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here we are setting the name of our cluster to &lt;code&gt;my-elasticsearch-cluster&lt;/code&gt;. The setting &lt;code&gt;network.host: 0.0.0.0&lt;/code&gt; means that Elasticsearch will listen on all network interfaces of the host machine, which matters if the host has more than one. And the last setting enables X-Pack security, which ensures that anyone trying to access our Elasticsearch instance must provide authentication credentials. &lt;/p&gt;

&lt;p&gt;Now open the &lt;code&gt;kibana.yml&lt;/code&gt; file and paste the following lines in it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;server.name: kibana
server.host: "0"
elasticsearch.hosts: [ "http://elasticsearch:9200" ]

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here we are setting the server name. The &lt;code&gt;server.host: "0"&lt;/code&gt; setting means that Kibana will listen on all network interfaces of the host machine, which matters if the host has more than one. And the last setting, &lt;code&gt;elasticsearch.hosts&lt;/code&gt;, is the list of addresses of the Elasticsearch nodes. The Kibana instance can reach the Elasticsearch instance at the address &lt;code&gt;http://elasticsearch:9200&lt;/code&gt;. This is made possible by Docker Compose: if you have multiple services in your compose file, containers belonging to one service can reach containers of other services by using the other service's name as a hostname. You don't even need to expose the ports for this. So, in our &lt;code&gt;docker-compose.yml&lt;/code&gt; file, even if we had not mapped the ports for Elasticsearch, our Kibana instance would still be able to reach the Elasticsearch instance at &lt;code&gt;http://elasticsearch:9200&lt;/code&gt;; however, in that case, we would not be able to connect to our Elasticsearch instance from the host machine. I won't be diving deeper into the details of how networking works in Docker because that is beyond the scope of this article, but I would definitely suggest going through the official docs to get a better understanding.  &lt;/p&gt;
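&lt;p&gt;Once the stack is running again, you can see this service-name resolution in action with a quick check from inside the Kibana container (a sketch; it assumes &lt;code&gt;curl&lt;/code&gt; is available in the Kibana image):&lt;/p&gt;

```shell
# From inside the kibana container, the compose service name
# "elasticsearch" resolves to the Elasticsearch container:
docker-compose exec kibana curl -s http://elasticsearch:9200
```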

&lt;p&gt;Ok, so now that we have our config files ready, we need to bind mount them to their respective containers in our &lt;code&gt;docker-compose.yml&lt;/code&gt; file. So open the &lt;code&gt;docker-compose.yml&lt;/code&gt; file and change it to look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: '3'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.8.0
    ports:
      - 9200:9200
    volumes:
      - ./elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml

  kibana:
    depends_on:
      - elasticsearch  
    image: docker.elastic.co/kibana/kibana:6.8.0
    ports:
      - 5601:5601
    volumes:
      - ./kibana.yml:/usr/share/kibana/config/kibana.yml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The only change we have made here is that we have added the &lt;code&gt;volumes&lt;/code&gt; sections. Using &lt;code&gt;volumes&lt;/code&gt;, we can map a directory or an individual file on the host machine to a directory or file in the container. Here we are mapping individual files only. The default location of the config file in the Elasticsearch container is &lt;code&gt;/usr/share/elasticsearch/config/elasticsearch.yml&lt;/code&gt;, and we are replacing it with the &lt;code&gt;elasticsearch.yml&lt;/code&gt; file that we created earlier. Similarly, we are replacing the default &lt;code&gt;kibana.yml&lt;/code&gt; file at &lt;code&gt;/usr/share/kibana/config/kibana.yml&lt;/code&gt; with our newly created file. With these changes, let's try to start our Docker Compose setup again by running the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker-compose up
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will most likely give you an error. If you look at the elasticsearch logs (lines starting with &lt;code&gt;elasticsearch_1  |&lt;/code&gt;), you might see an error like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;elasticsearch_1  | [1]: Transport SSL must be enabled if security is enabled on a [basic] license. Please set [xpack.security.transport.ssl.enabled] to [true] or disable security by setting [xpack.security.enabled] to [false]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This means that Elasticsearch won't start since the initial checks have failed. Consequently, Kibana won't be able to connect to it and you'll see something like this in Kibana logs:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;kibana_1         | {"type":"log","@timestamp":"2019-06-11T17:31:14Z","tags":["warning","elasticsearch","admin"],"pid":1,"message":"No living connections"}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Press &lt;code&gt;Ctrl+C&lt;/code&gt; to stop the containers and tear down the setup because this ain't working and we gotta fix it.&lt;/p&gt;

&lt;p&gt;In order to get Elasticsearch working, we will need to enable SSL and also install an SSL certificate in our elasticsearch container. I will be walking through the process of creating a new certificate and using that; if you already have a certificate file, you can skip that part. For this, we will need to take a step back and disable X-Pack security on our Elasticsearch instance so that we can get it up and running, and then we will get inside the container shell and generate the certificate.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3 - Create SSL certificate for Elasticsearch and enable SSL
&lt;/h3&gt;

&lt;p&gt;First, we need to disable x-pack security temporarily so that we can get our Elasticsearch container up and running. So, open the &lt;code&gt;elasticsearch.yml&lt;/code&gt; file and disable x-pack security by changing the following line:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;xpack.security.enabled: false
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then bring up the containers again by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker-compose up
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This should work fine now and bring up our Elasticsearch and Kibana services just like before. Next, we need to generate the certificates, for which we will be using the &lt;code&gt;elasticsearch-certutil&lt;/code&gt; utility. For this, we will need to get inside the Docker container running the elasticsearch service. This is really easy using &lt;code&gt;docker-compose&lt;/code&gt;. Think of it like this: we can execute any command inside a Docker container by using the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker-compose exec &amp;lt;service name&amp;gt; &amp;lt;command&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And if we want to get inside the container's shell, we essentially want to execute the &lt;code&gt;bash&lt;/code&gt; command on our container. So, our command becomes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker-compose exec elasticsearch bash
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here &lt;code&gt;elasticsearch&lt;/code&gt; is our service and &lt;code&gt;bash&lt;/code&gt; is our command. We need to do this while the container is running, so open another terminal window and paste the above command (make sure to run it from the same directory where your &lt;code&gt;docker-compose.yml&lt;/code&gt; file is located).&lt;/p&gt;

&lt;p&gt;Once you're inside the container, your shell prompt should look something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[root@c9f915e86309 elasticsearch]#
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now run the following command here:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[root@c9f915e86309 elasticsearch]# bin/elasticsearch-certutil ca
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will print some warnings describing what it is going to do; I recommend you read them. It will then prompt you for a file name and password. Just press &lt;code&gt;ENTER&lt;/code&gt; for both to proceed:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Please enter the desired output file [elastic-stack-ca.p12]: 
Enter password for elastic-stack-ca.p12 : 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will create a file &lt;code&gt;elastic-stack-ca.p12&lt;/code&gt; in the directory from which you ran the above command. You can check by running the &lt;code&gt;ls&lt;/code&gt; command. This is the certificate authority we will be using to create the certificate. Now, run the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[root@c9f915e86309 elasticsearch]# bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will again print some warnings describing what it is going to do; I recommend you read them too. It will prompt you for a password and a file name. Press &lt;code&gt;ENTER&lt;/code&gt; at all the steps to proceed:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Enter password for CA (elastic-stack-ca.p12) : 
Please enter the desired output file [elastic-certificates.p12]: 
Enter password for elastic-certificates.p12 : 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will create the &lt;code&gt;elastic-certificates.p12&lt;/code&gt; file, which is what we need. We need this file outside the container, on the host machine, because it will vanish once we destroy the container. This file is in PKCS12 format, which includes both the certificate and the private key. In order to copy this file from the container to the host machine, press &lt;code&gt;CTRL+D&lt;/code&gt; to first exit the container.&lt;/p&gt;

&lt;p&gt;And then run the following command on your host machine (from the same directory where &lt;code&gt;docker-compose.yml&lt;/code&gt; file is present)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker cp "$(docker-compose ps -q elasticsearch)":/usr/share/elasticsearch/elastic-certificates.p12 .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The above command might seem a bit tricky to some, so I will add a bit of explanation here. Those of you who already understand how it works can proceed to Step 4.&lt;/p&gt;

&lt;p&gt;Let us first see what &lt;code&gt;docker-compose ps&lt;/code&gt; does. If you run the command you'll see output like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;                      Name                                    Command               State                Ports              
----------------------------------------------------------------------------------------------------------------------------
elasticsearch-kibana-setup_elasticsearch_1   /usr/local/bin/docker-entr ...   Up      0.0.0.0:9200-&amp;gt;9200/tcp, 9300/tcp
elasticsearch-kibana-setup_kibana_1          /usr/local/bin/kibana-docker     Up      0.0.0.0:5601-&amp;gt;5601/tcp          

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This shows all the docker containers running or stopped which are being managed by our &lt;code&gt;docker-compose.yml&lt;/code&gt; file.&lt;/p&gt;

&lt;p&gt;If you check the help for this command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker-compose ps --help
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You will see output like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;List containers.

Usage: ps [options] [SERVICE...]

Options:
    -q, --quiet          Only display IDs
    --services           Display services
    --filter KEY=VAL     Filter services by a property
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can see that by using the &lt;code&gt;-q&lt;/code&gt; flag, we can get just the id of the container, and that by providing a service name, we can limit the output to just the service we are interested in. So, if we want the id of the elasticsearch container, we need to run the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker-compose ps -q elasticsearch
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This should get you the id of the elasticsearch container. &lt;/p&gt;

&lt;p&gt;Now, if we go back to our &lt;code&gt;docker cp&lt;/code&gt; command above, you can check the syntax of that command by using help again:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker cp --help
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This should display the help:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
Usage:  docker cp [OPTIONS] CONTAINER:SRC_PATH DEST_PATH|-
    docker cp [OPTIONS] SRC_PATH|- CONTAINER:DEST_PATH

Copy files/folders between a container and the local filesystem

Options:
  -a, --archive       Archive mode (copy all uid/gid information)
  -L, --follow-link   Always follow symbol link in SRC_PATH

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can see that we need to specify the command as:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker cp &amp;lt;container id&amp;gt;:&amp;lt;src path&amp;gt; &amp;lt;dest path on host&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Our source path in this case is &lt;code&gt;/usr/share/elasticsearch/elastic-certificates.p12&lt;/code&gt; on the elasticsearch container. And we are getting the id of the elasticsearch container by using the &lt;code&gt;docker-compose ps -q elasticsearch&lt;/code&gt; command. And we need to copy the file to the current directory on host so our destination path is &lt;code&gt;.&lt;/code&gt;. Hence the command becomes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker cp "$(docker-compose ps -q elasticsearch)":/usr/share/elasticsearch/elastic-certificates.p12 .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We will also copy the CA file by running the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker cp "$(docker-compose ps -q elasticsearch)":/usr/share/elasticsearch/elastic-stack-ca.p12 .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now that we have our certificate file on the host machine, we will bind mount it to our container just like we did for the &lt;code&gt;elasticsearch.yml&lt;/code&gt; file. So, if you already have an SSL certificate, you can use it in place of this one.&lt;/p&gt;
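&lt;p&gt;If you want to peek inside the PKCS12 bundle before mounting it, you can do so with &lt;code&gt;openssl&lt;/code&gt; (a sketch; it assumes &lt;code&gt;elastic-certificates.p12&lt;/code&gt; is in the current directory and was created with an empty password, as we did above):&lt;/p&gt;

```shell
# Print the certificate inside the PKCS12 file without the private key.
openssl pkcs12 -in elastic-certificates.p12 -passin pass: -nokeys -clcerts
```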

&lt;h3&gt;
  
  
  Step 4 - Installing the SSL certificate on Elasticsearch and enabling TLS in config
&lt;/h3&gt;

&lt;p&gt;Now that we have the SSL certificate available, we can enable X-Pack security on our Elasticsearch node and also enable TLS. First, we need to bind mount our certificate from the host machine to the container. Go back to the terminal where you ran the &lt;code&gt;docker-compose up&lt;/code&gt; command and press &lt;code&gt;CTRL+C&lt;/code&gt; to stop the containers. Then open the &lt;code&gt;docker-compose.yml&lt;/code&gt; file and change it so that it looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: '3'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.8.0
    ports:
      - 9200:9200
    volumes:
      - ./elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml
      - ./elastic-certificates.p12:/usr/share/elasticsearch/config/elastic-certificates.p12

  kibana:
    depends_on:
      - elasticsearch
    image: docker.elastic.co/kibana/kibana:6.8.0
    ports:
      - 5601:5601
    volumes:
      - ./kibana.yml:/usr/share/kibana/config/kibana.yml

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, open your &lt;code&gt;elasticsearch.yml&lt;/code&gt; file and change it to this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cluster.name: my-elasticsearch-cluster
network.host: 0.0.0.0
xpack.security.enabled: true
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.keystore.type: PKCS12
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.keystore.path: elastic-certificates.p12
xpack.security.transport.ssl.truststore.path: elastic-certificates.p12
xpack.security.transport.ssl.truststore.type: PKCS12
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The first three lines are the same as before (we have changed &lt;code&gt;xpack.security.enabled&lt;/code&gt; back to &lt;code&gt;true&lt;/code&gt;). The rest of the lines enable TLS on the transport layer and point to our keystore and truststore, which in our case are the same PKCS12 file, since it contains both the certificate and the private key. You can check out all the &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/security-settings.html" rel="noopener noreferrer"&gt;security settings here&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;Once this is done, go back to terminal and bring up the container again&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker-compose up
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So, what do you see? Still not working, eh? This is because Kibana is not able to connect to our Elasticsearch instance: we now have security enabled but haven't configured the credentials on Kibana. So, you'll see continuous logs like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;kibana_1         | {"type":"log","@timestamp":"2019-06-11T19:03:35Z","tags":["warning","task_manager"],"pid":1,"message":"PollError [security_exception] missing authentication token for REST request [/_template/.kibana_task_manager?include_type_name=true&amp;amp;filter_path=*.version], with { header={ WWW-Authenticate=\"Basic realm=\\\"security\\\" charset=\\\"UTF-8\\\"\" } }"}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Also, if you open &lt;a href="http://localhost:9200" rel="noopener noreferrer"&gt;http://localhost:9200&lt;/a&gt; in your web browser, you will see a prompt for username and password. And if you press &lt;code&gt;ESC&lt;/code&gt;, you get this error:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "error": {
    "root_cause": [
      {
        "type": "security_exception",
        "reason": "missing authentication token for REST request [/]",
        "header": {
          "WWW-Authenticate": "Basic realm=\"security\" charset=\"UTF-8\""
        }
      }
    ],
    "type": "security_exception",
    "reason": "missing authentication token for REST request [/]",
    "header": {
      "WWW-Authenticate": "Basic realm=\"security\" charset=\"UTF-8\""
    }
  },
  "status": 401
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And if you try visiting &lt;a href="http://localhost:5601" rel="noopener noreferrer"&gt;http://localhost:5601&lt;/a&gt; you will get the error:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Kibana server is not ready yet
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So, we have solved one part of the problem. We have secured our Elasticsearch instance, and nobody can access it without providing the correct credentials. But we don't know what the correct credentials are. We will be setting those up in the next step and configuring Kibana to use them. For now, keep the &lt;code&gt;docker-compose up&lt;/code&gt; command running, since we need to go inside the Elasticsearch container again.&lt;/p&gt;
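&lt;p&gt;Under the hood, the "missing authentication token" errors above refer to HTTP Basic authentication: clients must send an &lt;code&gt;Authorization: Basic&lt;/code&gt; header whose value is the base64 encoding of &lt;code&gt;username:password&lt;/code&gt;. You can see what such a token looks like from any shell (the credentials below are purely illustrative, not real ones):&lt;/p&gt;

```shell
# Base64-encode illustrative credentials the way HTTP Basic auth does.
printf 'elastic:changeme' | base64
# → ZWxhc3RpYzpjaGFuZ2VtZQ==
```

&lt;p&gt;Tools like &lt;code&gt;curl -u user:password&lt;/code&gt; build this header for you automatically.&lt;/p&gt;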

&lt;h3&gt;
  
  
  Step 5 - Generate default passwords and configure the credentials in Kibana
&lt;/h3&gt;

&lt;p&gt;Before we generate the passwords for the built-in accounts of the Elastic Stack, we first need to change our &lt;code&gt;docker-compose.yml&lt;/code&gt; file to bind mount the data volume of Elasticsearch. Up until now, the storage of our containers has been ephemeral: once we destroy the containers, all the data inside them is destroyed as well. So if you created any indices, users, etc. in Elasticsearch, they will no longer persist once you run &lt;code&gt;docker-compose down&lt;/code&gt; to bring down the services. That's not something we would want in production; we want the data to persist between container restarts. For that, we will need to bind mount the data directory of the elasticsearch container to a directory on the host machine. &lt;/p&gt;
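&lt;p&gt;(As an alternative to a bind mount, a Docker named volume also persists data across container restarts, and it sidesteps the host-directory permission issues that Elasticsearch sometimes runs into with bind mounts. The following is only a sketch of that alternative, not what we use in this article:)&lt;/p&gt;

```yaml
# Sketch only: a named volume instead of a host-directory bind mount.
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.8.0
    volumes:
      - esdata:/usr/share/elasticsearch/data

# Declaring the named volume; Docker manages its location on disk.
volumes:
  esdata:
```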

&lt;p&gt;First, bring down all the running containers by executing the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker-compose down
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then create a directory called &lt;code&gt;docker-data-volumes&lt;/code&gt; in the same directory where your &lt;code&gt;docker-compose.yml&lt;/code&gt; file is located. You can give it any other name, but for this tutorial we will call it &lt;code&gt;docker-data-volumes&lt;/code&gt;. Inside that directory, create another directory called &lt;code&gt;elasticsearch&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir docker-data-volumes
mkdir docker-data-volumes/elasticsearch
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now under the &lt;code&gt;volumes&lt;/code&gt; section of &lt;code&gt;elasticsearch&lt;/code&gt; service in your &lt;code&gt;docker-compose.yml&lt;/code&gt; file, add the following line:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      - ./docker-data-volumes/elasticsearch:/usr/share/elasticsearch/data
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As explained earlier, when we need to bind mount a file or directory from host machine to container, we specify the &lt;code&gt;&amp;lt;host path&amp;gt;:&amp;lt;container path&amp;gt;&lt;/code&gt;. The default path for data inside an elasticsearch container is &lt;code&gt;/usr/share/elasticsearch/data&lt;/code&gt; and we are binding it to the directory &lt;code&gt;./docker-data-volumes/elasticsearch&lt;/code&gt; on host machine. So your &lt;code&gt;docker-compose.yml&lt;/code&gt; file should now look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: '3'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.8.0
    ports:
      - 9200:9200
    volumes:
      - ./elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml
      - ./elastic-certificates.p12:/usr/share/elasticsearch/config/elastic-certificates.p12
      - ./docker-data-volumes/elasticsearch:/usr/share/elasticsearch/data

  kibana:
    depends_on:
      - elasticsearch
    image: docker.elastic.co/kibana/kibana:6.8.0
    ports:
      - 5601:5601
    volumes:
      - ./kibana.yml:/usr/share/kibana/config/kibana.yml

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Bring up the containers by running&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker-compose up
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;While &lt;code&gt;docker-compose up&lt;/code&gt; is running in one terminal, open another terminal to get inside the elasticsearch container by running the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker-compose exec elasticsearch bash
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then run the following command to generate passwords for all the built-in users:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[root@c9f915e86309 elasticsearch]# bin/elasticsearch-setup-passwords auto
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note them down and keep them somewhere safe. Exit the container by pressing &lt;code&gt;CTRL+D&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Now open the &lt;code&gt;kibana.yml&lt;/code&gt; file and change it to this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;server.name: kibana
server.host: "0"
elasticsearch.hosts: [ "http://elasticsearch:9200" ]
elasticsearch.username: kibana
elasticsearch.password: &amp;lt;kibana password&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You need to put the password for kibana user here in the &lt;code&gt;elasticsearch.password&lt;/code&gt; setting. &lt;/p&gt;

&lt;p&gt;Go to the terminal where &lt;code&gt;docker-compose up&lt;/code&gt; was running and press &lt;code&gt;CTRL+C&lt;/code&gt; to bring the containers down. Then run the command again:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker-compose up
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This should bring up both services, elasticsearch and kibana. Now, if you open your browser and visit &lt;a href="http://localhost:9200" rel="noopener noreferrer"&gt;http://localhost:9200&lt;/a&gt; it will again prompt you for a username and password. Here, enter &lt;code&gt;elastic&lt;/code&gt; as the username along with the password you got for the &lt;code&gt;elastic&lt;/code&gt; user earlier. On successful authentication, you should see output like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "name" : "1mG1JlU",
  "cluster_name" : "my-elasticsearch-cluster",
  "cluster_uuid" : "-mEbLeYVRb-XqA24yq6D1w",
  "version" : {
    "number" : "6.8.0",
    "build_flavor" : "default",
    "build_type" : "docker",
    "build_hash" : "65b6179",
    "build_date" : "2019-05-15T20:06:13.172855Z",
    "build_snapshot" : false,
    "lucene_version" : "7.7.0",
    "minimum_wire_compatibility_version" : "5.6.0",
    "minimum_index_compatibility_version" : "5.0.0"
  },
  "tagline" : "You Know, for Search"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
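&lt;p&gt;You can also run the same check from the command line instead of the browser, passing the credentials with curl's basic-auth flag (substitute the password generated for the &lt;code&gt;elastic&lt;/code&gt; user):&lt;/p&gt;

```shell
# Same check as visiting http://localhost:9200 in the browser;
# replace YOUR_ELASTIC_PASSWORD with the generated password.
curl -u elastic:YOUR_ELASTIC_PASSWORD http://localhost:9200
```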



&lt;p&gt;Also, if you open &lt;a href="http://localhost:5601" rel="noopener noreferrer"&gt;http://localhost:5601&lt;/a&gt; you will see the Kibana console but now it will ask for username and password:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F06%2FScreenshot-from-2019-06-12-01-00-10.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F06%2FScreenshot-from-2019-06-12-01-00-10.png" alt="Screenshot-from-2019-06-12-01-00-10"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here, too, enter &lt;code&gt;elastic&lt;/code&gt; as the username along with its password. If you have followed all the steps correctly so far, you will see the Kibana console on successful authentication.&lt;/p&gt;

&lt;p&gt;Now if you click on the &lt;strong&gt;Management&lt;/strong&gt; tab in the sidebar, you will see the &lt;strong&gt;Security&lt;/strong&gt; section in the right hand panel. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F07%2FScreenshot-from-2019-07-27-16-15-10.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F07%2FScreenshot-from-2019-07-27-16-15-10.png" alt="Screenshot-from-2019-07-27-16-15-10"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here you can view users and roles, create new roles, and create new users. There is a lot you can do here and I would recommend playing with it for a while to get a feel for it.&lt;/p&gt;
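&lt;p&gt;Everything in that UI is backed by the security API, so you can script these operations as well. For example, creating a user with curl against a 6.x cluster looks roughly like this (the username, password and role below are made up for illustration):&lt;/p&gt;

```shell
# Create a user via the X-Pack security API (Elasticsearch 6.x endpoint),
# authenticating as the built-in elastic superuser.
ELASTIC_PASSWORD='the password generated for the elastic user'

curl -u "elastic:$ELASTIC_PASSWORD" -X POST "http://localhost:9200/_xpack/security/user/demo_user" \
  -H 'Content-Type: application/json' \
  -d '{ "password": "demo-password-123", "roles": ["kibana_user"], "full_name": "Demo User" }'
```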

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;This completes our tutorial: an Elasticsearch and Kibana cluster with the free Basic license and X-Pack security enabled, set up using docker-compose. I hope you found it helpful. If you have any suggestions or find any errors, feel free to comment below. &lt;/p&gt;

&lt;p&gt;Happy Coding :-)&lt;/p&gt;

</description>
      <category>elasticsearch</category>
      <category>xpack</category>
      <category>docker</category>
      <category>kibana</category>
    </item>
    <item>
      <title>How to read or modify spreadsheets from Google Sheets using Node.js ?</title>
      <dc:creator>Mandeep Singh Gulati</dc:creator>
      <pubDate>Wed, 25 Sep 2019 17:12:40 +0000</pubDate>
      <link>https://dev.to/mandeepm91/how-to-read-or-modify-spreadsheets-from-google-sheets-using-node-js-4c30</link>
      <guid>https://dev.to/mandeepm91/how-to-read-or-modify-spreadsheets-from-google-sheets-using-node-js-4c30</guid>
      <description>&lt;p&gt;First of all, a brief overview of our use case. Let's say I have a spreadsheet on Google Sheets which is not public and I want to be able to read/modify programmatically through some batch process running on my local machine or some server. This is something I had to do recently with a Node.js application and I found the authentication part a bit tricky to understand. So I thought of sharing my solution and I hope it helps someone in need. There might be better ways of doing this but I am sharing what worked best for me.&lt;/p&gt;

&lt;p&gt;Since there is no user interaction involved in our use case, we don't want to use the OAuth process where a user needs to open a browser and sign in to their Google account to authorize the application. For scenarios like this, Google has the concept of a &lt;em&gt;service account&lt;/em&gt;. A service account is a special type of Google account intended to represent a non-human user that needs to authenticate and be authorized to access data in Google APIs. Just like a normal account, a service account has an email address (although it doesn't have an actual mailbox and you cannot send emails to a service account email). And just like you can share a Google Sheet with a user using their email address, you can share a Google Sheet with a service account using its email address. This is exactly what we are going to do in this tutorial. We will create a spreadsheet on Google Sheets using a regular user, share it with a service account (that we will create) and use the credentials of the service account in our Node.js script to read and modify that sheet. &lt;/p&gt;

&lt;h3&gt;
  
  
  Pre-requisites
&lt;/h3&gt;

&lt;p&gt;This tutorial assumes that you have:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Experience working with Node.js&lt;/li&gt;
&lt;li&gt;A Google account&lt;/li&gt;
&lt;li&gt;A project set up on Google Developers Console where you have admin privileges&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Steps Overview
&lt;/h3&gt;

&lt;p&gt;Here is the list of steps we will be following through this tutorial:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create a spreadsheet on Google sheets&lt;/li&gt;
&lt;li&gt;Enable Google Sheets API in our project on Google developers console&lt;/li&gt;
&lt;li&gt;Create a service account &lt;/li&gt;
&lt;li&gt;Share the spreadsheet created in step 1 with the service account created in step 3&lt;/li&gt;
&lt;li&gt;Write a Node.js service to access the google sheets created in step 1 using the service account credentials&lt;/li&gt;
&lt;li&gt;Test our service written in step 5 &lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Now that we have an outline of what we are going to do, let's get started.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Create a spreadsheet on Google Sheets
&lt;/h3&gt;

&lt;p&gt;This one doesn't really need any instructions. You just need to log in to your Google account, open Google Drive and create a new Google Sheet. You can put some random data in it. One thing we need to take note of is the sheet's id. When you have the sheet open in your browser, the URL will look something like this: &lt;code&gt;https://docs.google.com/spreadsheets/d/1-XXXXXXXXXXXXXXXXXXXSgGTwY/edit#gid=0&lt;/code&gt;. In this URL, &lt;code&gt;1-XXXXXXXXXXXXXXXXXXXSgGTwY&lt;/code&gt; is the spreadsheet's id, and it will be different for each spreadsheet. Take note of it because we will need it in our Node.js script to access this spreadsheet. For this tutorial, here is the data we have stored in our spreadsheet:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-13-19-40.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-13-19-40.png" alt="Screenshot-from-2019-09-21-13-19-40"&gt;&lt;/a&gt;&lt;/p&gt;
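&lt;p&gt;Incidentally, if you ever need to pull the spreadsheet id out of a sheet URL in code rather than by eye, a tiny helper like the following does the trick (a sketch; it assumes the standard &lt;code&gt;/spreadsheets/d/&lt;/code&gt; URL shape):&lt;/p&gt;

```javascript
// Extract the spreadsheet id from a Google Sheets URL.
// Returns null if the URL doesn't match the expected format.
function getSpreadsheetIdFromUrl(url) {
  const match = url.match(/\/spreadsheets\/d\/([a-zA-Z0-9-_]+)/);
  return match ? match[1] : null;
}

console.log(getSpreadsheetIdFromUrl(
  'https://docs.google.com/spreadsheets/d/1-XXXXXXXXXXXXXXXXXXXSgGTwY/edit#gid=0'
));
// → 1-XXXXXXXXXXXXXXXXXXXSgGTwY
```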

&lt;h3&gt;
  
  
  Step 2: Enable Google Sheets API in our project on Google developers console
&lt;/h3&gt;

&lt;p&gt;We need to enable the Google Sheets API for our project in order to use it. This tutorial assumes that you already have a project in Google Developers Console; if you don't have one, you can create a new one very easily. Once you have the project, open the project dashboard. There you should see a button &lt;em&gt;Enable APIs and Services&lt;/em&gt;. &lt;/p&gt;

&lt;p&gt;Click on it and search for the Google Sheets API using the search bar. Once you see it, click on it and then click on &lt;em&gt;Enable&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-02-33.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-02-33.png" alt="Screenshot-from-2019-09-21-12-02-33"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Create a Service Account
&lt;/h3&gt;

&lt;p&gt;Once you enable the Google Sheets API in your project, you will see the page where you can configure the settings for this API. Click on the &lt;em&gt;Credentials&lt;/em&gt; tab in the left sidebar. Here you will see a list of OAuth client IDs and service accounts. By default there should be none. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-03-16.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-03-16.png" alt="Screenshot-from-2019-09-21-12-03-16"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click on &lt;em&gt;Create Credentials&lt;/em&gt; button at the top and select &lt;em&gt;Service Account&lt;/em&gt; option&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-03-26.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-03-26.png" alt="Screenshot-from-2019-09-21-12-03-26"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Enter the name and description of the service account and click &lt;em&gt;Create&lt;/em&gt; button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-04-01.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-04-01.png" alt="Screenshot-from-2019-09-21-12-04-01"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click &lt;em&gt;Continue&lt;/em&gt; on the next dialog&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-04-15.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-04-15.png" alt="Screenshot-from-2019-09-21-12-04-15"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the next dialog, you get an option to create a key. This is an important step. Click on the &lt;em&gt;Create Key&lt;/em&gt; button and choose &lt;em&gt;JSON&lt;/em&gt; as the format. This will ask you to download the JSON file to your local machine. &lt;/p&gt;

&lt;p&gt;For this tutorial, I have renamed the file and saved it as &lt;code&gt;service_account_credentials.json&lt;/code&gt; on my local machine.&lt;/p&gt;

&lt;p&gt;Keep it somewhere safe. This key file contains the credentials of the service account that we need in our Node.js script to access our spreadsheet from Google Sheets.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-04-56.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-04-56.png" alt="Screenshot-from-2019-09-21-12-04-56"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you've followed all of these steps, you should see the newly created service account on the credentials page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-05-42.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-12-05-42.png" alt="Screenshot-from-2019-09-21-12-05-42"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Take a note of the email address of the service account. We will need to share our spreadsheet with this account.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4: Share the spreadsheet created in step 1 with the service account created in step 3
&lt;/h3&gt;

&lt;p&gt;Now that we have a service account, we need to share our spreadsheet with it. It's just like sharing a spreadsheet with any normal user account. Open the spreadsheet in your browser and click on the &lt;em&gt;Share&lt;/em&gt; button in the top right corner. That will open a modal where you need to enter the email address of the service account. Uncheck the &lt;em&gt;Notify people&lt;/em&gt; checkbox: this would send an email, and since the service account does not have a mailbox, you would get a mail delivery failure notification. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-13-49-42.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-13-49-42.png" alt="Screenshot-from-2019-09-21-13-49-42"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click &lt;em&gt;OK&lt;/em&gt; button to share the spreadsheet with the service account.&lt;/p&gt;

&lt;p&gt;This completes all the configuration steps. Now we can get to the fun part :-)&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 5: Write a Node.js service to access the google sheet using the service account credentials
&lt;/h3&gt;

&lt;p&gt;We will create our script as a service that can be used as part of a bigger project. We will call it &lt;code&gt;googleSheetsService.js&lt;/code&gt;. It will expose the following functions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;getAuthToken&lt;/li&gt;
&lt;li&gt;getSpreadSheet&lt;/li&gt;
&lt;li&gt;getSpreadSheetValues&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The function &lt;code&gt;getAuthToken&lt;/code&gt; is where we will handle the authentication; it will return a token, which we will then pass on to the other methods.&lt;/p&gt;

&lt;p&gt;We will not be covering writing data to the spreadsheet but once you get the basic idea of how to use the API, it will be easy to extend the service to add more and more functions supported by the Google Sheets API. &lt;/p&gt;

&lt;p&gt;We will be using the &lt;code&gt;googleapis&lt;/code&gt; npm module. So, let's get started by creating a directory for this demo project. Let's call it &lt;code&gt;google-sheets-demo&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cd $HOME
mkdir google-sheets-demo
cd google-sheets-demo
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy the &lt;code&gt;service_account_credentials.json&lt;/code&gt; file that we created in step 3 to this directory (&lt;code&gt;google-sheets-demo&lt;/code&gt;). And create our new file &lt;code&gt;googleSheetsService.js&lt;/code&gt;. Paste the following lines to the file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// googleSheetsService.js

const { google } = require('googleapis')

const SCOPES = ['https://www.googleapis.com/auth/spreadsheets']

async function getAuthToken() {
  const auth = new google.auth.GoogleAuth({
    scopes: SCOPES
  });
  const authToken = await auth.getClient();
  return authToken;
}

module.exports = {
  getAuthToken,
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For now our service has only one function that returns the auth token. We will add another function &lt;code&gt;getSpreadSheet&lt;/code&gt; soon. First let us see what our function does. &lt;/p&gt;

&lt;p&gt;First, we require the &lt;code&gt;googleapis&lt;/code&gt; npm module. Then we define &lt;code&gt;SCOPES&lt;/code&gt;. When we create an auth token using Google APIs, there is a concept of scopes, which determine the level of access our client has. For reading and editing spreadsheets, we need access to the scope &lt;code&gt;https://www.googleapis.com/auth/spreadsheets&lt;/code&gt;. Similarly, if we only needed read-only access to spreadsheets, we would use the scope &lt;code&gt;https://www.googleapis.com/auth/spreadsheets.readonly&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;Inside the &lt;code&gt;getAuthToken&lt;/code&gt; function, we are calling the constructor &lt;code&gt;new google.auth.GoogleAuth&lt;/code&gt; passing in the scopes in the arguments object. &lt;/p&gt;

&lt;p&gt;This function expects two environment variables to be available: &lt;code&gt;GCLOUD_PROJECT&lt;/code&gt;, which is the project ID of your Google Developers Console project, and &lt;code&gt;GOOGLE_APPLICATION_CREDENTIALS&lt;/code&gt;, which denotes the path of the file containing the credentials of the service account.&lt;/p&gt;

&lt;p&gt;We will need to set these environment variables from the command line. You can find the project ID in the URL of the project when you open it in your web browser. It should look like this:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;code&gt;https://console.cloud.google.com/home/dashboard?project={project ID}&lt;/code&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;And &lt;code&gt;GOOGLE_APPLICATION_CREDENTIALS&lt;/code&gt; must contain the path of the &lt;code&gt;service_account_credentials.json&lt;/code&gt; file. So, go to the terminal and from the &lt;code&gt;google-sheets-demo&lt;/code&gt; directory, run the following commands to set these environment variables:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export GCLOUD_PROJECT={project ID of your google project}
export GOOGLE_APPLICATION_CREDENTIALS=./service_account_credentials.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You need to make sure that you have copied the credentials file into the current directory. &lt;/p&gt;

&lt;p&gt;Now we will add two more functions to our service:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;getSpreadSheet&lt;/li&gt;
&lt;li&gt;getSpreadSheetValues&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The first one will return metadata about the spreadsheet while the second one will return the data inside the spreadsheet. Our modified &lt;code&gt;googleSheetsService.js&lt;/code&gt; file should look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// googleSheetsService.js

const { google } = require('googleapis');
const sheets = google.sheets('v4');

const SCOPES = ['https://www.googleapis.com/auth/spreadsheets'];

async function getAuthToken() {
  const auth = new google.auth.GoogleAuth({
    scopes: SCOPES
  });
  const authToken = await auth.getClient();
  return authToken;
}

async function getSpreadSheet({spreadsheetId, auth}) {
  const res = await sheets.spreadsheets.get({
    spreadsheetId,
    auth,
  });
  return res;
}

async function getSpreadSheetValues({spreadsheetId, auth, sheetName}) {
  const res = await sheets.spreadsheets.values.get({
    spreadsheetId,
    auth,
    range: sheetName
  });
  return res;
}


module.exports = {
  getAuthToken,
  getSpreadSheet,
  getSpreadSheetValues
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;At the top we have added a line&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const sheets = google.sheets('v4');
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This gives us access to the Sheets API. Then we have added the two new functions &lt;code&gt;getSpreadSheet&lt;/code&gt; and &lt;code&gt;getSpreadSheetValues&lt;/code&gt;. To see all the supported API endpoints for the Google Sheets API, check this link: &lt;a href="https://developers.google.com/sheets/api/reference/rest" rel="noopener noreferrer"&gt;https://developers.google.com/sheets/api/reference/rest&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;For our demo, we are only using two of those. The &lt;code&gt;getSpreadSheet&lt;/code&gt; function expects the &lt;code&gt;auth&lt;/code&gt; token and the &lt;code&gt;spreadsheetId&lt;/code&gt; as its parameters. And &lt;code&gt;getSpreadSheetValues&lt;/code&gt; expects one additional parameter: the &lt;code&gt;sheetName&lt;/code&gt; from which to fetch the data. By default, a spreadsheet contains a single sheet, named &lt;code&gt;Sheet1&lt;/code&gt;. Finally, we export the newly added functions via &lt;code&gt;module.exports&lt;/code&gt;.&lt;/p&gt;
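&lt;p&gt;A note on what &lt;code&gt;getSpreadSheetValues&lt;/code&gt; gives you back: the response's &lt;code&gt;data.values&lt;/code&gt; is a plain array of arrays, with the first row being the header row in our sheet. If you prefer working with objects, a small transform like the following helps (an illustrative helper, not part of the &lt;code&gt;googleapis&lt;/code&gt; module):&lt;/p&gt;

```javascript
// Convert the 2D `values` array returned by the Sheets API into an
// array of row objects keyed by the header row.
function rowsToObjects(values) {
  const [headers, ...rows] = values;
  return rows.map(row =>
    Object.fromEntries(headers.map((header, i) => [header, row[i]]))
  );
}

const values = [
  ['Name', 'Country', 'Age'],
  ['John', 'England', '30'],
  ['Jane', 'Scotland', '23']
];
console.log(rowsToObjects(values));
// each row becomes an object like { Name: 'John', Country: 'England', Age: '30' }
```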

&lt;p&gt;This completes our &lt;code&gt;googleSheetsService&lt;/code&gt;. If you need to support more API functions, you can check the reference using the link above, add the corresponding wrapper functions to this service and export them using &lt;code&gt;module.exports&lt;/code&gt;. Any consumer of this service will first need to call the &lt;code&gt;getAuthToken&lt;/code&gt; function to get the auth token and then pass that token to the subsequent functions like &lt;code&gt;getSpreadSheet&lt;/code&gt; and &lt;code&gt;getSpreadSheetValues&lt;/code&gt;. Now that we have our service ready, we just need to test it to make sure it is working fine.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 6: Test our service
&lt;/h3&gt;

&lt;p&gt;So we have our service ready. But does it work? Let's check that out.&lt;/p&gt;

&lt;p&gt;While we would typically use a testing framework to run unit tests, to keep this tutorial simple we are going to write a plain Node.js script. From our project's directory, create a new file called &lt;code&gt;test.js&lt;/code&gt; and paste in the following contents:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const {
  getAuthToken,
  getSpreadSheet,
  getSpreadSheetValues
} = require('./googleSheetsService.js');

const spreadsheetId = process.argv[2];
const sheetName = process.argv[3];

async function testGetSpreadSheet() {
  try {
    const auth = await getAuthToken();
    const response = await getSpreadSheet({
      spreadsheetId,
      auth
    })
    console.log('output for getSpreadSheet', JSON.stringify(response.data, null, 2));
  } catch(error) {
    console.log(error.message, error.stack);
  }
}

async function testGetSpreadSheetValues() {
  try {
    const auth = await getAuthToken();
    const response = await getSpreadSheetValues({
      spreadsheetId,
      sheetName,
      auth
    })
    console.log('output for getSpreadSheetValues', JSON.stringify(response.data, null, 2));
  } catch(error) {
    console.log(error.message, error.stack);
  }
}

function main() {
  testGetSpreadSheet();
  testGetSpreadSheetValues();
}

main()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This file contains two test functions and a &lt;code&gt;main&lt;/code&gt; function that is calling those test functions. At the bottom of the file, we are executing the &lt;code&gt;main&lt;/code&gt; function. This script expects two command line arguments:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;spreadsheetId (this is the ID that we got from step 1) &lt;/li&gt;
&lt;li&gt;sheetName (this is the name of the worksheet whose values you want to see; when you create a new spreadsheet, it is &lt;code&gt;Sheet1&lt;/code&gt;)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-20-16-13.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fcodingfundas.com%2Fcontent%2Fimages%2F2019%2F09%2FScreenshot-from-2019-09-21-20-16-13.png" alt="Screenshot-from-2019-09-21-20-16-13"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Also, ensure that the env variables &lt;code&gt;GCLOUD_PROJECT&lt;/code&gt; and &lt;code&gt;GOOGLE_APPLICATION_CREDENTIALS&lt;/code&gt; are set properly. &lt;/p&gt;

&lt;p&gt;Now, from the terminal, run this script&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;node test.js &amp;lt;your google sheet's spreadsheet id&amp;gt; &amp;lt;sheet name of the worksheet&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you have followed all the steps correctly, you should see output like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;output for getSpreadSheet {
  "spreadsheetId": "1-jG5jSgGTwXXXXXXXXXXXXXXXXXXY",
  "properties": {
    "title": "test-sheet",
    "locale": "en_US",
    "autoRecalc": "ON_CHANGE",
    "timeZone": "Asia/Calcutta",
    "defaultFormat": {
      "backgroundColor": {
        "red": 1,
        "green": 1,
        "blue": 1
      },
      "padding": {
        "top": 2,
        "right": 3,
        "bottom": 2,
        "left": 3
      },
      "verticalAlignment": "BOTTOM",
      "wrapStrategy": "OVERFLOW_CELL",
      "textFormat": {
        "foregroundColor": {},
        "fontFamily": "arial,sans,sans-serif",
        "fontSize": 10,
        "bold": false,
        "italic": false,
        "strikethrough": false,
        "underline": false
      }
    }
  },
  "sheets": [
    {
      "properties": {
        "sheetId": 0,
        "title": "Sheet1",
        "index": 0,
        "sheetType": "GRID",
        "gridProperties": {
          "rowCount": 1000,
          "columnCount": 26
        }
      }
    }
  ],
  "spreadsheetUrl": "https://docs.google.com/spreadsheets/d/1-jG5jSgGTwXXXXXXXXXXXXXXXXXXY/edit"
}
output for getSpreadSheetValues {
  "range": "Sheet1!A1:Z1000",
  "majorDimension": "ROWS",
  "values": [
    [
      "Name",
      "Country",
      "Age"
    ],
    [
      "John",
      "England",
      "30"
    ],
    [
      "Jane",
      "Scotland",
      "23"
    ],
    [
      "Bob",
      "USA",
      "45"
    ],
    [
      "Alice",
      "India",
      "33"
    ]
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you get an error, it means you have not followed all the steps correctly. For this tutorial, the version of the &lt;code&gt;googleapis&lt;/code&gt; npm module was &lt;code&gt;43.0.0&lt;/code&gt;; you might face issues if you are using an older version of the module. Make sure the spreadsheetId and sheetName are correct and the environment variables are set properly. If you still get an error, check the error message and status code to see what might be causing the problem. &lt;/p&gt;
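&lt;p&gt;When debugging, the HTTP status code attached to the error is usually the quickest clue. As a rough guide (based on standard HTTP semantics, not an exhaustive list of the API's failure modes), you can map the common cases like this:&lt;/p&gt;

```javascript
// Map common Google Sheets API error status codes to likely causes.
function explainSheetsError(code) {
  switch (code) {
    case 400: return 'Bad request: check the range/sheetName parameter';
    case 403: return 'Permission denied: the sheet may not be shared with the service account';
    case 404: return 'Not found: the spreadsheetId is probably wrong';
    default:  return `Unexpected error (status ${code})`;
  }
}

console.log(explainSheetsError(403));
```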

&lt;h3&gt;
  
  
  References
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/googleapis/google-api-nodejs-client" rel="noopener noreferrer"&gt;Documentation for Google API Node.js client&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://developers.google.com/sheets/api/reference/rest" rel="noopener noreferrer"&gt;Official Google Sheets API reference&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I would definitely recommend checking out these references (especially the official Google Sheets API reference) to get a more in-depth understanding of the Sheets API and how to use the Node.js client. &lt;/p&gt;

&lt;p&gt;Hope you found this tutorial helpful. Thanks and happy coding :-)&lt;/p&gt;

</description>
      <category>googleapi</category>
      <category>node</category>
      <category>googlesheets</category>
    </item>
  </channel>
</rss>
