<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Yash Lotan</title>
    <description>The latest articles on DEV Community by Yash Lotan (@yash0212).</description>
    <link>https://dev.to/yash0212</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F364033%2Fe87bd044-ef73-4b8c-8bc1-038aef93689f.png</url>
      <title>DEV Community: Yash Lotan</title>
      <link>https://dev.to/yash0212</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/yash0212"/>
    <language>en</language>
    <item>
      <title>Bitbucket Pipeline deployment with AWS ECR on EC2</title>
      <dc:creator>Yash Lotan</dc:creator>
      <pubDate>Sun, 19 Apr 2020 12:19:34 +0000</pubDate>
      <link>https://dev.to/yash0212/bitbucket-pipeline-deployment-with-aws-ecr-on-ec2-3k13</link>
      <guid>https://dev.to/yash0212/bitbucket-pipeline-deployment-with-aws-ecr-on-ec2-3k13</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;This is my very first tutorial on any technical blogging platform. In this tutorial, we'll configure a custom pipeline that builds a Docker image from the Dockerfile in the root of the Bitbucket repository, pushes it to Amazon ECR, and executes a script on an EC2 server that pulls the image and starts a container.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;An AWS account&lt;/li&gt;
&lt;li&gt;An ECR repository - I'll name it demo-repo&lt;/li&gt;
&lt;li&gt;An EC2 instance with Docker installed&lt;/li&gt;
&lt;li&gt;A key file (.pem) to connect to the EC2 instance via SSH&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;awscli&lt;/code&gt; installed and configured on the EC2 instance&lt;/li&gt;
&lt;/ul&gt;
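&lt;p&gt;What "configured" means concretely: &lt;code&gt;awscli&lt;/code&gt; stores its settings in two plain-text files, normally created by running &lt;code&gt;aws configure&lt;/code&gt;. The sketch below writes example versions of them to a local &lt;code&gt;aws-demo&lt;/code&gt; folder purely for illustration; on the EC2 instance they live in &lt;code&gt;~/.aws/&lt;/code&gt;. The key values and region are placeholders; use an IAM user that is allowed to pull from ECR.&lt;/p&gt;

```shell
# Sketch of the files `aws configure` creates. Written to ./aws-demo here for
# illustration; on the EC2 instance they live in ~/.aws/. All values below are
# placeholders -- substitute the credentials of an IAM user with ECR access.
mkdir -p aws-demo
printf '%s\n' \
  '[default]' \
  'aws_access_key_id = AKIAEXAMPLEKEYID' \
  'aws_secret_access_key = EXAMPLESECRETACCESSKEY' \
  > aws-demo/credentials
printf '%s\n' \
  '[default]' \
  'region = us-east-1' \
  'output = json' \
  > aws-demo/config
```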

&lt;h3&gt;
  
  
  Step 1
&lt;/h3&gt;

&lt;p&gt;In this step, we'll see how to enable Pipelines for a repository.&lt;/p&gt;

&lt;p&gt;Enable Pipelines for the Bitbucket repository in which you want to use the pipeline. You need administrator access to enable Pipelines support for a repository. Once you have admin access, go to Settings &amp;gt; Pipelines &amp;gt; Settings. Only one option will be available here, Enable Pipelines; turn it on with the toggle.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fbgxb6wu9galeow80pubf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fbgxb6wu9galeow80pubf.png" alt="Pipeline settings enable button"&gt;&lt;/a&gt;&lt;br&gt;
Click on the Configure/View &lt;code&gt;bitbucket-pipelines.yml&lt;/code&gt; button and continue to the next step.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2
&lt;/h3&gt;

&lt;p&gt;In this step, we'll select a language template for our pipeline.&lt;/p&gt;

&lt;p&gt;You'll now be on the pipeline configuration page. Scroll down to the bottom, where you'll see a set of language templates to get started with.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F8109xij4wweysg3qt0xq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F8109xij4wweysg3qt0xq.png" alt="Choose language template"&gt;&lt;/a&gt;&lt;br&gt;
You can select the language template that suits your needs, or, if none of the predefined templates fits exactly, pick the closest one and edit it according to your needs. Click the Commit file button at the bottom right once you have selected a template. I'll select JavaScript, as my project runs on Node.&lt;/p&gt;

&lt;p&gt;On committing the file, you'll see that a &lt;code&gt;bitbucket-pipelines.yml&lt;/code&gt; file is now present in your repository's main branch. You can merge this branch into your own branch and continue editing the file there, or keep editing it in the main branch itself, but it is good practice not to change files directly in the main branch.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3
&lt;/h3&gt;

&lt;p&gt;In this step, we'll configure the &lt;code&gt;bitbucket-pipelines.yml&lt;/code&gt; file.&lt;/p&gt;

&lt;p&gt;The basic setup of the pipeline is done; now we'll create a custom pipeline with a step that deploys our repository.&lt;br&gt;
Note: &lt;code&gt;bitbucket-pipelines.yml&lt;/code&gt; is a YAML file, so indentation matters: some keys require a 2-space indent and others a 4-space indent.&lt;/p&gt;

&lt;p&gt;Let's edit the &lt;code&gt;bitbucket-pipelines.yml&lt;/code&gt; file:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

image: node:10.15.3

pipelines:
  custom:
    build-and-deploy-to-dev:
      - step:
          deployment: Test
          services:
            - docker
          script:
            - docker build ./ -t $AWS_ECR_REPOSITORY --build-arg ENV=dev


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the above code, I have created a custom pipeline called &lt;code&gt;build-and-deploy-to-dev&lt;/code&gt;, and the deployment environment 'Test' is applied to its step. The step builds a Docker image from the Dockerfile in the root of the repository. &lt;code&gt;$AWS_ECR_REPOSITORY&lt;/code&gt; is a deployment variable; Bitbucket Pipelines supports both repository variables and deployment variables (check the references for more information on variables). I have six deployment variables set up for my pipeline: &lt;code&gt;AWS_ECR_REPOSITORY&lt;/code&gt;, &lt;code&gt;SERVER_IP&lt;/code&gt;, &lt;code&gt;SSH_KEY&lt;/code&gt;, &lt;code&gt;AWS_DEFAULT_REGION&lt;/code&gt;, &lt;code&gt;AWS_ACCESS_KEY_ID&lt;/code&gt;, and &lt;code&gt;AWS_SECRET_ACCESS_KEY&lt;/code&gt;.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5zvv8vl81nbcwsjunv66.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5zvv8vl81nbcwsjunv66.png" alt="Deployment Variables"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I have kept &lt;code&gt;SSH_KEY&lt;/code&gt; as a secured variable; I'll show you how to set its value in the upcoming steps.&lt;br&gt;
These six variables are defined as deployment variables because their values differ for each deployment environment: test, staging, and production.&lt;br&gt;
The values for &lt;code&gt;AWS_DEFAULT_REGION&lt;/code&gt;, &lt;code&gt;AWS_ACCESS_KEY_ID&lt;/code&gt;, and &lt;code&gt;AWS_SECRET_ACCESS_KEY&lt;/code&gt; can be the same as those of the IAM user used to configure &lt;code&gt;awscli&lt;/code&gt; on the EC2 instance.&lt;/p&gt;

&lt;p&gt;Getting back to the code: every step must have a &lt;code&gt;script&lt;/code&gt; section. Notice that most lines are indented by 2 spaces, but there is a 4-space indent between &lt;code&gt;- step&lt;/code&gt; and the &lt;code&gt;deployment: Test&lt;/code&gt; line; this is the YAML format, and the pipeline will not execute if it isn't followed.&lt;br&gt;
Now you can run this pipeline; it will build a Docker image tagged &lt;code&gt;demo-repo&lt;/code&gt; in my case.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4
&lt;/h3&gt;

&lt;p&gt;In this step, we'll add pipes to the custom pipeline for additional functionality.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

image: node:10.15.3

pipelines:
  custom:
    build-and-deploy-to-dev:
      - step:
          deployment: Test
          services:
            - docker
          script:
            - docker build ./ -t $AWS_ECR_REPOSITORY --build-arg ENV=dev
            - pipe: "atlassian/aws-ecr-push-image:1.1.0"
              variables:
                  AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                  AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                  AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                  IMAGE_NAME: $AWS_ECR_REPOSITORY
            - pipe: "atlassian/ssh-run:0.2.4"
              variables:
                  SSH_USER: ec2-user
                  SERVER: $SERVER_IP
                  SSH_KEY: $SSH_KEY
                  MODE: script
                  COMMAND: demo-repo-container-script.sh


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;I have added two pipes to the custom pipeline; pipes can be viewed as mini pipelines that ease pipeline development.&lt;br&gt;
The first pipe pushes the Docker image built in Step 3 to the AWS ECR repository, and the second pipe runs the &lt;code&gt;demo-repo-container-script.sh&lt;/code&gt; script, present in the root of the repository, on the EC2 instance via SSH.&lt;/p&gt;

&lt;p&gt;Note: There should be only a 2-space indent between the &lt;code&gt;- pipe:&lt;/code&gt; line and the &lt;code&gt;variables:&lt;/code&gt; line.&lt;/p&gt;

&lt;p&gt;The links to the pipes are listed in the references.&lt;/p&gt;

&lt;h3&gt;
  
  
  Setting up &lt;code&gt;SSH_KEY&lt;/code&gt;
&lt;/h3&gt;

&lt;p&gt;I read the documentation of the ssh-run pipe and searched the internet, but I was unable to figure out the exact key format to use for the secured variable, so I'll show you how it's done.&lt;/p&gt;

&lt;p&gt;First of all, you need the .pem file that you use to connect to the EC2 instance via SSH, and a Linux machine. I only managed to do this on Linux, so feel free to share methods for other operating systems in the comments. I generated &lt;code&gt;demo-repo-kp.pem&lt;/code&gt; with EC2's key-pair generation feature.&lt;/p&gt;

&lt;p&gt;This is what the demo-repo-kp.pem file contents look like (don't worry, I'll delete this key pair before this article goes live):&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F0lvb1yiew1cxrwe5zvyz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F0lvb1yiew1cxrwe5zvyz.png" alt="demo-repo-kp.pem file contents"&gt;&lt;/a&gt;&lt;br&gt;
As you can see, this is an RSA key file, and the &lt;code&gt;ssh-run&lt;/code&gt; pipe requires &lt;code&gt;SSH_KEY&lt;/code&gt; to be a base64-encoded private key.&lt;br&gt;
To convert this pem file to a PKCS#8 private key, use this Linux command:&lt;br&gt;
&lt;code&gt;$ openssl pkcs8 -in demo-repo-kp.pem -topk8 -nocrypt -out pv-key.pem&lt;/code&gt;&lt;br&gt;
Replace &lt;code&gt;demo-repo-kp.pem&lt;/code&gt; with your input key file.&lt;br&gt;
Now the &lt;code&gt;pv-key.pem&lt;/code&gt; contents look like this:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fn9asfpep48mqgef6byls.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fn9asfpep48mqgef6byls.png" alt="Private key contents"&gt;&lt;/a&gt;&lt;br&gt;
Now that you have the private key, you can use any &lt;a href="https://www.base64encode.org/" rel="noopener noreferrer"&gt;online tool&lt;/a&gt; to convert the private key file to a base64-encoded string, which looks like random characters typically ending with &lt;code&gt;=&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Note: Don't forget to include the first line containing &lt;code&gt;-----BEGIN PRIVATE KEY-----&lt;/code&gt; and the last line containing &lt;code&gt;-----END PRIVATE KEY-----&lt;/code&gt; while encoding the file in base64.&lt;/p&gt;

&lt;p&gt;Now you can use the base64-encoded string as the value of &lt;code&gt;SSH_KEY&lt;/code&gt;; don't forget to tick the Secured checkbox.&lt;/p&gt;
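&lt;p&gt;If you'd rather not paste a private key into an online encoder, the whole conversion can also be done locally. The sketch below generates a throwaway RSA key purely for illustration; with your real &lt;code&gt;demo-repo-kp.pem&lt;/code&gt; you'd skip the first command.&lt;/p&gt;

```shell
# Throwaway RSA key for illustration only -- skip this and use your real
# demo-repo-kp.pem from EC2's key-pair generation feature.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out demo-repo-kp.pem

# Convert to a PKCS#8 private key, as in the article
openssl pkcs8 -in demo-repo-kp.pem -topk8 -nocrypt -out pv-key.pem

# Base64-encode without line wrapping (-w 0) so the value pastes cleanly
# into the SSH_KEY deployment variable
base64 -w 0 pv-key.pem > ssh_key_b64.txt
```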

&lt;h3&gt;
  
  
  Configuring demo-repo-container-script.sh
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F7drkmqb6bbn1xqgm279e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F7drkmqb6bbn1xqgm279e.png" alt="Container Script"&gt;&lt;/a&gt;&lt;br&gt;
The script is well commented; if you have any queries, feel free to ask in the comments.&lt;/p&gt;
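&lt;p&gt;Since the script only appears as a screenshot above, here is a hedged sketch of what a &lt;code&gt;demo-repo-container-script.sh&lt;/code&gt; along these lines could look like; the region, account ID, port mapping, and container name are placeholders, not the exact values from the screenshot. The sketch writes the script out line by line so it can be dropped into the repository root.&lt;/p&gt;

```shell
# Sketch of demo-repo-container-script.sh -- region, account ID, ports, and
# container name are placeholders, not the exact values from the screenshot.
printf '%s\n' \
  '#!/bin/bash' \
  'set -e' \
  'REGION=us-east-1' \
  'ACCOUNT_ID=123456789012' \
  'REGISTRY="$ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com"' \
  'IMAGE="$REGISTRY/demo-repo:latest"' \
  '# Log Docker in to ECR using the awscli configured on this instance' \
  'aws ecr get-login-password --region "$REGION" | docker login --username AWS --password-stdin "$REGISTRY"' \
  '# Pull the image the pipeline pushed' \
  'docker pull "$IMAGE"' \
  '# Replace the running container: remove the old one, start the new one' \
  'docker rm -f demo-repo-container 2>/dev/null || true' \
  'docker run -d --name demo-repo-container -p 80:3000 "$IMAGE"' \
  > demo-repo-container-script.sh
chmod +x demo-repo-container-script.sh
```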

&lt;h2&gt;
  
  
  References
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://confluence.atlassian.com/bitbucket/variables-in-pipelines-794502608.html" rel="noopener noreferrer"&gt;Variables in pipelines&lt;/a&gt;&lt;br&gt;
&lt;a href="https://confluence.atlassian.com/bitbucket/use-ssh-keys-in-bitbucket-pipelines-847452940.html" rel="noopener noreferrer"&gt;Use SSH keys in Bitbucket Pipelines&lt;/a&gt;&lt;br&gt;
&lt;a href="https://confluence.atlassian.com/bitbucket/caching-dependencies-895552876.html" rel="noopener noreferrer"&gt;Caching dependencies&lt;/a&gt;&lt;br&gt;
&lt;a href="https://bitbucket.org/atlassian/ssh-run/src/0.2.5/" rel="noopener noreferrer"&gt;ssh-run pipe&lt;/a&gt;&lt;br&gt;
&lt;a href="https://bitbucket.org/atlassian/aws-ecr-push-image/src/1.1.2/" rel="noopener noreferrer"&gt;aws-ecr-push-image pipe&lt;/a&gt;&lt;/p&gt;

</description>
      <category>bitbucket</category>
      <category>pipelines</category>
    </item>
  </channel>
</rss>
