<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Hari Pranav A</title>
    <description>The latest articles on DEV Community by Hari Pranav A (@haripranav).</description>
    <link>https://dev.to/haripranav</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F368918%2F5f4b439e-5a1a-480a-9c8b-931e1b9f01d8.jpg</url>
      <title>DEV Community: Hari Pranav A</title>
      <link>https://dev.to/haripranav</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/haripranav"/>
    <language>en</language>
    <item>
      <title>A DevOps Approach to Building Scalable Cloud-Native Data Engineering Applications</title>
      <dc:creator>Hari Pranav A</dc:creator>
      <pubDate>Tue, 24 May 2022 11:20:02 +0000</pubDate>
      <link>https://dev.to/haripranav/a-cicd-approach-to-building-scalable-cloud-native-data-engineering-applications-2f9a</link>
      <guid>https://dev.to/haripranav/a-cicd-approach-to-building-scalable-cloud-native-data-engineering-applications-2f9a</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IUmUMy5I--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ml5ckx225cgcw0a1g1wt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IUmUMy5I--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ml5ckx225cgcw0a1g1wt.png" alt="Image description" width="880" height="700"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the &lt;a href="https://medium.com/@haripranav98/building-data-engineering-pipelines-on-aws-5329f3120e77"&gt;previous blog post&lt;/a&gt; we created a Flask application on an AWS EC2 instance and built custom data pipelines that interact with data engineering tools on AWS. This method of deployment has its pitfalls: the software development lifecycle, which involves maintaining code, developing new features, collaborating, and versioning, is difficult to manage by hand.&lt;br&gt;
Hence we shift to a DevOps approach, which gives developers end-to-end pipelines from development through testing to production and can absorb collaborative changes.&lt;/p&gt;

&lt;p&gt;In this blog post we will be:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Creating a Docker image of the Flask application&lt;/li&gt;
&lt;li&gt;Publishing the image to the AWS container registry (ECR) and running it on the AWS container service (ECS) using AWS Fargate&lt;/li&gt;
&lt;li&gt;Adding a load balancer for the ECS deployment&lt;/li&gt;
&lt;li&gt;Creating an AWS CodeCommit repository to push code to&lt;/li&gt;
&lt;li&gt;Configuring AWS CodeBuild to build new changes from the CodeCommit repository&lt;/li&gt;
&lt;li&gt;Configuring CodePipeline to run steps 1 to 5 automatically whenever a new commit is made to the CodeCommit repository&lt;/li&gt;
&lt;/ol&gt;

&lt;h1&gt;
  
  
  Creating a Docker image of the Flask Application:
&lt;/h1&gt;

&lt;p&gt;We can create a simple Flask application that interacts with AWS as shown in the blog post below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://medium.com/@haripranav98/building-data-engineering-pipelines-on-aws-5329f3120e77"&gt;Using Flask to create custom data pipelines&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the Flask application is working on the EC2 instance, we add a Dockerfile, which is used to build a Docker image from the existing application.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create two new files called Dockerfile and requirements.txt in the same parent directory, as shown below&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PUn2oHGk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166158371-56af212b-5d56-4498-b833-de7358287de1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PUn2oHGk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166158371-56af212b-5d56-4498-b833-de7358287de1.png" alt="image" width="255" height="334"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Create the requirements.txt file, which lists all the packages to be installed in the Docker image.&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ cd CLOUDAUDIT_DATA_WRANGLER

$ nano requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;Then add the two packages inside the file:&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Flask
awswrangler
&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Here we don't specify versions, which ensures that the latest version of each package gets installed.&lt;/p&gt;
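&lt;p&gt;The trade-off is that unpinned builds are not reproducible: two builds run at different times can install different versions. If you want repeatable builds, pin the versions instead (the version numbers below are illustrative placeholders, not tested against this app):&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Flask==2.1.2
awswrangler==2.15.1
&lt;/code&gt;&lt;/pre&gt;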

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Add the following lines into the Dockerfile.&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# syntax=docker/dockerfile:1

FROM python:3.8-slim-buster

WORKDIR /python-docker

COPY requirements.txt requirements.txt

RUN pip3 install -r requirements.txt

COPY . .

CMD [ "python3", "-m" , "flask", "run", "--host=0.0.0.0"]
&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The first line (# syntax=docker/dockerfile:1) tells the builder which Dockerfile syntax to use while parsing the file.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The second line (FROM python:3.8-slim-buster) uses an existing base image for our Docker image.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The third line (WORKDIR /python-docker) sets the working directory inside the image.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The fourth and fifth lines tell Docker to copy requirements.txt into the image and then run pip install for all the packages and dependencies.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The sixth line (COPY . .) tells Docker to copy the rest of the project files into the image.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The last line runs the Flask app as a module (-m) and binds it to --host=0.0.0.0 so the container is reachable from the browser.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Building the Docker Image:
&lt;/h2&gt;

&lt;p&gt;When we run the commands shown below, Docker builds an image tagged &lt;strong&gt;python-docker&lt;/strong&gt;. This image can then be pushed to our container registry on AWS.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker build --tag python-docker .

$ docker run -d -p 5000:5000 python-docker
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Here "-d" will run it in detached mode and "-p" will expose the specific port.&lt;/p&gt;

&lt;h1&gt;
  
  
  Publishing the image to AWS Container Registry and Running the code in AWS Container Service using AWS Fargate
&lt;/h1&gt;

&lt;p&gt;We will be using AWS Fargate which is a serverless compute engine for Amazon ECS that runs containers without the headache of managing the infrastructure.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open the AWS console and then search for ECR&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Click on &lt;strong&gt;Get Started&lt;/strong&gt; and then &lt;strong&gt;Create Repository&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--O0Bn3AIF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166156413-75d5d1f9-d79d-4051-99dd-9c73cd20601d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--O0Bn3AIF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166156413-75d5d1f9-d79d-4051-99dd-9c73cd20601d.png" alt="image" width="880" height="351"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Szu1mMku--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166156597-46d3d06d-dbb2-49f0-9a55-8c53838feebe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Szu1mMku--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166156597-46d3d06d-dbb2-49f0-9a55-8c53838feebe.png" alt="image" width="880" height="619"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here the repository name is &lt;strong&gt;flask-app&lt;/strong&gt;.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Now click on push commands, as shown in the screenshot below.&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U2EsVqXV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166156715-39018f0b-65ec-41f1-ade9-1cd09cf1cc22.png" alt="image" width="852" height="689"&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This gives a list of commands to run on our EC2 instance to push the Docker image to ECR.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5Eve_3wj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166156818-327d426d-a690-49a5-93cf-bd27c0af5907.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5Eve_3wj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166156818-327d426d-a690-49a5-93cf-bd27c0af5907.png" alt="image" width="880" height="682"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For &lt;strong&gt;Command 1&lt;/strong&gt; we need to attach a policy to the EC2 instance so it has the necessary permissions to authenticate with ECR.&lt;/p&gt;

&lt;p&gt;The links below show how we can attach a policy to the EC2 machine.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html#attach-iam-role"&gt;Attach IAM to Ec2&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/AmazonECR/latest/userguide/security_iam_id-based-policy-examples.html"&gt;Add ECR execution Policy&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After we add the permissions, we need not run the second command, as we have already created the Docker image.&lt;/p&gt;

&lt;p&gt;Now for &lt;strong&gt;commands 3 and 4&lt;/strong&gt; we need to run:&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ docker tag python-docker:latest xxxxxxxxxx/flask-app:latest

$ docker push xxxxxxxxxx/flask-app:latest
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;&lt;em&gt;REPLACE THE XXXXXXXXXX with the repository URI from your console&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Here we tag the image &lt;strong&gt;python-docker&lt;/strong&gt; and then push it to ECR under the name &lt;strong&gt;flask-app&lt;/strong&gt;.&lt;/p&gt;
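&lt;p&gt;Put together, the push sequence can be sketched as below; the account ID and region are hypothetical placeholders, so substitute the values shown in your own console:&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Placeholders -- use your own account ID and region
ACCOUNT_ID=123456789012
REGION=us-east-1
ECR_URI="${ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com/flask-app"

# Command 1: authenticate Docker with ECR
aws ecr get-login-password --region "$REGION" | docker login --username AWS --password-stdin "${ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com"

# Commands 3 and 4: tag the local image for ECR, then push it
docker tag python-docker:latest "${ECR_URI}:latest"
docker push "${ECR_URI}:latest"
&lt;/code&gt;&lt;/pre&gt;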

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Creating a cluster&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Search for ECS on the AWS console, click on &lt;strong&gt;create a new cluster&lt;/strong&gt;, give it a name, and choose the default &lt;strong&gt;VPC and subnets&lt;/strong&gt;. Then choose &lt;strong&gt;AWS Fargate&lt;/strong&gt; as shown in the image below.&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2DE-iqxe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157117-5e51b5cc-658e-48ae-b807-cca2b991bf76.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2DE-iqxe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157117-5e51b5cc-658e-48ae-b807-cca2b991bf76.png" alt="image" width="837" height="750"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Then in the left pane choose &lt;strong&gt;Task Definitions&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SOl4eLfZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157309-f4608340-27b5-491d-83b7-40462fec9ca8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SOl4eLfZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157309-f4608340-27b5-491d-83b7-40462fec9ca8.png" alt="image" width="823" height="775"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here, give the task definition a unique name, enter &lt;strong&gt;flask-app&lt;/strong&gt; as the container name, and enter the image URI, which can be found in ECR as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--36WQwvs7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157388-7fe33fb0-b5c4-4dc5-8b87-c573319e0f53.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--36WQwvs7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157388-7fe33fb0-b5c4-4dc5-8b87-c573319e0f53.png" alt="image" width="880" height="130"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For the port mapping choose ports &lt;strong&gt;5000 and 80&lt;/strong&gt;, then click Next. For the &lt;strong&gt;Environment&lt;/strong&gt; choose Fargate and leave the other defaults as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8beihxUF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157459-5c17dd33-6cb8-49e1-85fd-86e3ccd4dfc7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8beihxUF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157459-5c17dd33-6cb8-49e1-85fd-86e3ccd4dfc7.png" alt="image" width="630" height="560"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, review and click Next. This adds the container from ECR to Fargate.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Running the Task Definitions:&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;As shown in the screenshot below, click on the created Task Definition and click Run&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_1hY54pq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157555-b0f06a36-a3d7-44da-bb1a-60677034d0d0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_1hY54pq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157555-b0f06a36-a3d7-44da-bb1a-60677034d0d0.png" alt="image" width="880" height="122"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Choose the Existing Cluster as shown below&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HBSNPBx4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157616-053ee8b1-41b0-43ee-a609-adc2a4eb14a2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HBSNPBx4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157616-053ee8b1-41b0-43ee-a609-adc2a4eb14a2.png" alt="image" width="636" height="690"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Choose the VPC and subnet, create a new security group, and click on Deploy as shown below.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gEbtouzm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157671-5d6b25e7-858e-4c5a-b0eb-e61dfe521a77.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gEbtouzm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157671-5d6b25e7-858e-4c5a-b0eb-e61dfe521a77.png" alt="image" width="652" height="613"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Navigate back to &lt;strong&gt;Clusters on the left&lt;/strong&gt;, then click on the cluster which has been created. Then click on &lt;strong&gt;Tasks&lt;/strong&gt; to get the ENI, highlighted in black in the image below.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--coQkrXZN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157869-0d152b0a-8a11-434a-8984-55d230790c7b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--coQkrXZN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157869-0d152b0a-8a11-434a-8984-55d230790c7b.png" alt="image" width="880" height="183"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click on the ENI and make a note of the public IP address as shown below.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---hlUhRTh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157970-249023fa-900a-4432-a46b-7226e2dc4680.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---hlUhRTh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166157970-249023fa-900a-4432-a46b-7226e2dc4680.png" alt="image" width="880" height="105"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click on the security group and edit the inbound rules to allow traffic on ports 5000 and 80 as shown below.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0yjKnR4P--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166158122-7239a5d4-b1cd-41d9-8686-9117599a7b31.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0yjKnR4P--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166158122-7239a5d4-b1cd-41d9-8686-9117599a7b31.png" alt="image" width="880" height="127"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--RV1KdHXM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166158195-963be55c-d609-4f4d-8a45-8fcba4e083bf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RV1KdHXM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166158195-963be55c-d609-4f4d-8a45-8fcba4e083bf.png" alt="image" width="880" height="99"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Once this is done, enter the public IP from the ENI with port 5000&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Eg: &lt;strong&gt;&lt;a href="http://publicIP:5000"&gt;http://publicIP:5000&lt;/a&gt;&lt;/strong&gt; in the URL bar, and once this loads YOU HAVE SUCCESSFULLY LAUNCHED A DOCKER IMAGE ON THE ELASTIC CONTAINER SERVICE !!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PvqVRe8D--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166158258-05d1488a-c735-43e9-8a61-e5188dcce687.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PvqVRe8D--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166158258-05d1488a-c735-43e9-8a61-e5188dcce687.png" alt="image" width="880" height="556"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Add a Load Balancer for the ECS deployment:
&lt;/h1&gt;

&lt;p&gt;Open the EC2 Console and then choose the application load balancer as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Q4dH0-Nb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/168069807-ca8183ac-031f-492a-b746-f3af741270b8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Q4dH0-Nb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/168069807-ca8183ac-031f-492a-b746-f3af741270b8.png" alt="image" width="446" height="733"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Make sure to shift to the OLD AWS CONSOLE while doing this step, as it helps to follow the same steps as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6iq5IP0N--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/168070929-81d61586-7a0f-431f-9522-6b157a96f86b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6iq5IP0N--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/168070929-81d61586-7a0f-431f-9522-6b157a96f86b.png" alt="image" width="880" height="271"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Choose &lt;strong&gt;Application Load Balancer&lt;/strong&gt; and click Create&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Next, configure it by giving it a name and selecting the VPC and availability zones&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--f5lolsKG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169034353-9ec32f8e-52f1-4b3c-91d9-82567a751e66.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--f5lolsKG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169034353-9ec32f8e-52f1-4b3c-91d9-82567a751e66.png" alt="image" width="880" height="408"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Click Next, select Create a new security group and then click Next&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Give the target group a name, select IP for the Target type, and then click Next&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--uKfN7T2o--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169034491-b28126b8-95f1-40b7-96bc-1e9de4574665.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--uKfN7T2o--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169034491-b28126b8-95f1-40b7-96bc-1e9de4574665.png" alt="image" width="880" height="402"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Click Next, Review it and click Create&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Once created note down the DNS name, which is the public address for the service&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Creating a Fargate Service:
&lt;/h1&gt;

&lt;p&gt;We will use the same task definition to create a &lt;strong&gt;Fargate Service&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to Task Definitions in Amazon ECS, tick the radio button corresponding to the existing Task definition and click Actions and Create Service.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--pvoS61OQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169035450-c9666d41-07e5-46cf-b898-30bace419d02.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pvoS61OQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169035450-c9666d41-07e5-46cf-b898-30bace419d02.png" alt="image" width="880" height="240"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Choose Fargate as launch type, give it a name, do not change the Deployment type (Rolling update), and click Next.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wUGZ5QqX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169035677-85b0e741-fe30-4805-9c27-276f4942382f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wUGZ5QqX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169035677-85b0e741-fe30-4805-9c27-276f4942382f.png" alt="image" width="550" height="797"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Choose the subnets that we configured in the load balancer.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gzrLBGyI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169035851-d2a6bae6-267f-47d6-9ecc-2e0ca71aa124.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gzrLBGyI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169035851-d2a6bae6-267f-47d6-9ecc-2e0ca71aa124.png" alt="image" width="880" height="310"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Choose Application Load Balancer for the load balancer type, and then click Add to load balancer.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vFaNVi6N--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169036079-e6de31dd-f567-46a0-8b47-0f037c49840b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vFaNVi6N--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169036079-e6de31dd-f567-46a0-8b47-0f037c49840b.png" alt="image" width="610" height="494"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select the Target group name that we have created in the Application load balancer and then click Next, Review it and then click Create Service.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Now we need to configure the application load balancer security group we created earlier: go to the load balancer, click on its security group, click Edit inbound rules, and add a Custom TCP rule for port 5000, since this is the internal port the Flask application listens on&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now we can check that the application is running by visiting the load balancer DNS name in a browser&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It will most probably not run, and there are very few guides for troubleshooting this issue. The root cause is as follows:&lt;/p&gt;

&lt;p&gt;Flow of the application before adding the load balancer:&lt;/p&gt;

&lt;p&gt;Client -&amp;gt; URL -&amp;gt; Public IP of Fargate (ports 5000 and 80 are opened to a specific IP) -&amp;gt; Response&lt;/p&gt;

&lt;p&gt;Once we add the new load balancer and open the ports:&lt;/p&gt;

&lt;p&gt;Client -&amp;gt; URL -&amp;gt; Load balancer -&amp;gt; Public IP of Fargate (ports 5000 and 80 are opened only to a specific IP, &lt;strong&gt;NOT to the load balancer's security group&lt;/strong&gt;) -&amp;gt; No response, because traffic arriving from the load balancer is blocked&lt;/p&gt;

&lt;p&gt;This behaviour is reflected in the documentation, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5WEm0uFt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169037040-4ed651b1-712a-4797-bf66-153e42804fe6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5WEm0uFt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169037040-4ed651b1-712a-4797-bf66-153e42804fe6.png" alt="image" width="821" height="582"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To troubleshoot this issue, we need to &lt;strong&gt;OPEN THE PORT&lt;/strong&gt; to the newly created &lt;strong&gt;security group&lt;/strong&gt; which is attached to the load balancer.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open ECS and select your cluster which contains the newly created service as shown below.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4D-B6Icj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169038163-4f6b0078-275e-40fa-8c0c-8274a39ed336.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4D-B6Icj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169038163-4f6b0078-275e-40fa-8c0c-8274a39ed336.png" alt="image" width="880" height="392"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click on the Task and go to the ENI as shown below&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CVdlTHPO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169038453-6e4157ba-d707-409d-aaec-5824520e1251.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CVdlTHPO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169038453-6e4157ba-d707-409d-aaec-5824520e1251.png" alt="image" width="880" height="410"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Edit the inbound rules, add the &lt;strong&gt;security group&lt;/strong&gt; that was given to the load balancer as the source, and also open port 5000, as shown below&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--985AFVYe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169038800-8e60281c-27c7-4e8d-b5cc-717e716e04b5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--985AFVYe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169038800-8e60281c-27c7-4e8d-b5cc-717e716e04b5.png" alt="image" width="880" height="268"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Now if we open the load balancer page and visit the DNS address in our browser, we should get the FLASK APPLICATION.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Hurray, we have successfully deployed the application on ECS and added a load balancer !!&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Another way to check whether the application is running is the load balancer's &lt;strong&gt;target group&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Open the EC2 console, navigate to the load balancer, and check the target group. If it shows &lt;strong&gt;Unhealthy or Draining&lt;/strong&gt;, refer to the links below; the cause is most probably inbound port rules that have not been opened. In the screenshot below, we can see that the instance is healthy&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--FqSpAKtE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169111085-c0bb3ba7-56ea-4ccf-8a9d-ebe5a2792734.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FqSpAKtE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169111085-c0bb3ba7-56ea-4ccf-8a9d-ebe5a2792734.png" alt="image" width="880" height="352"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If we are still getting 503 or 504 errors, refer to the links below:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://aws.amazon.com/premiumsupport/knowledge-center/public-load-balancer-private-ec2/"&gt;aws&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/elasticloadbalancing/latest/application/load-balancer-update-security-groups.html"&gt;aws&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://stackoverflow.com/questions/44403982/aws-load-balancer-ec2-health-check-request-timed-out-failure"&gt;stackoverflow&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Create an AWS CodeCommit repository to push code:
&lt;/h1&gt;

&lt;p&gt;Open the AWS console and search for CodeCommit, click &lt;strong&gt;Create repository&lt;/strong&gt;, and give it a name and a &lt;strong&gt;description&lt;/strong&gt; as shown in the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Tq71FiJt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166207560-3cc886c1-d3af-47df-8900-e04e4092e0a3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Tq71FiJt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166207560-3cc886c1-d3af-47df-8900-e04e4092e0a3.png" alt="image" width="880" height="601"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Add a new file called &lt;strong&gt;buildspec.yml&lt;/strong&gt; in the parent directory.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Now our application folder structure will look as per the screenshot shown below.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jiQg5c3b--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166219011-2809bb37-52cd-401f-8c1a-c8cdacc5acbd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jiQg5c3b--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166219011-2809bb37-52cd-401f-8c1a-c8cdacc5acbd.png" alt="image" width="880" height="248"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;This file contains the commands used to compile, test, and package the code.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Add a buildspec file and push this repo to Git:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--uya7kUGd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169280328-7df73c84-3de3-45c7-b042-295379a40520.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--uya7kUGd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169280328-7df73c84-3de3-45c7-b042-295379a40520.png" alt="image" width="880" height="278"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/dmoonat/sentiment-analysis/tree/master/containerized_webapp"&gt;Download and change according to the image above&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;We can see that &lt;strong&gt;images.json&lt;/strong&gt; is an artifact file which contains the container name and the image URI of the container pushed to ECR. We then use these in CodePipeline.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The artifact is built from the container name defined earlier, &lt;strong&gt;flask-app&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
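&lt;p&gt;For reference, the image definitions artifact is just a small JSON list mapping the container name to the image URI. A minimal sketch of what the build step produces (the account ID and region below are placeholders; the file name follows the &lt;strong&gt;images.json&lt;/strong&gt; convention used later in this article):&lt;/p&gt;

```python
import json

# Placeholder values for illustration: "flask-app" is the container name
# from the ECS task definition; the image URI points at the ECR repository.
image_definitions = [
    {
        "name": "flask-app",
        "imageUri": "123456789012.dkr.ecr.us-east-1.amazonaws.com/flask-app:latest",
    }
]

# CodePipeline's ECS deploy action reads this artifact from the build output.
with open("images.json", "w") as f:
    json.dump(image_definitions, f)
```

&lt;p&gt;In practice the last commands of the buildspec write this file, with the image URI built from the actual ECR repository and commit tag.&lt;/p&gt;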

&lt;p&gt;Since this is a YAML file, we need to check the spacing carefully: indentation must be consistent and must use spaces, as YAML does not allow tabs.&lt;/p&gt;
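&lt;p&gt;A quick local sanity check can catch the most common mistake before CodeBuild does. Since YAML forbids tab characters for indentation, scanning the file for tab-indented lines (a stdlib-only sketch, not an official validator) is cheap insurance:&lt;/p&gt;

```python
# Report line numbers in a YAML document whose indentation contains tabs,
# which YAML does not allow and which breaks buildspec parsing.
def tab_indented_lines(text):
    bad = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        indent = line[: len(line) - len(line.lstrip())]
        if "\t" in indent:
            bad.append(lineno)
    return bad

sample = "version: 0.2\nphases:\n  build:\n    commands:\n      - echo build\n"
print(tab_indented_lines(sample))  # [] -> indentation uses spaces only
```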

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Once the repository is created we need to add specific permission and create Git credentials to access the CodeCommit repository.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Go to the IAM console, choose &lt;strong&gt;Users&lt;/strong&gt;, select the user you want to configure for CodeCommit, attach the &lt;strong&gt;AWSCodeCommitPowerUser&lt;/strong&gt; policy from the policies list, review, and click Add permissions.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yvXAXVkO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166207923-4b408bb7-5710-4fe3-af0d-2073a60751d3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yvXAXVkO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166207923-4b408bb7-5710-4fe3-af0d-2073a60751d3.png" alt="image" width="880" height="380"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Configure Git using the documentation below:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/codecommit/latest/userguide/setting-up-gc.html?icmpid=docs_acc_console_connect_np"&gt;Configure Git Locally&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Follow the AWS documentation to add Git credentials for the AWS account. Once this is added, open the repository created in CodeCommit and then run the following commands.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;cd into your project folder and run the commands as shown:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Initialize the local directory as a Git repository.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;$ git init&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add the files in your new local repository. This stages them for the first commit.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;$ git add .&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Commit the files that you've staged in your local repository.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;$ git commit -m "First commit"&lt;/p&gt;

&lt;p&gt;At the top of your CodeCommit repository page, click &lt;strong&gt;Clone URL&lt;/strong&gt; to copy the remote repository URL.&lt;/p&gt;

&lt;p&gt;In the command prompt, add the URL for the remote repository where your local repository will be pushed.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Sets the new remote&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;$ git remote add origin &amp;lt;repository-URL&amp;gt;&lt;/p&gt;

&lt;p&gt;Refer to the screenshot below to find the URL:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--J5Y1_YZy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169132983-5eaf5efc-176b-41bb-9d99-063bb4e00a38.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--J5Y1_YZy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169132983-5eaf5efc-176b-41bb-9d99-063bb4e00a38.png" alt="image" width="880" height="339"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Verifies the new remote URL&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;$ git remote -v&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Push the changes to the &lt;strong&gt;master&lt;/strong&gt; branch&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;$ git push origin master&lt;/p&gt;

&lt;h1&gt;
  
  
  Configure CodePipeline to automatically run steps 1 to 5 whenever a new commit is made to the CodeCommit repository:
&lt;/h1&gt;

&lt;p&gt;We can then use CodePipeline to configure builds that take code from CodeCommit, push the image to ECR, and in turn run the image on ECS.&lt;/p&gt;

&lt;p&gt;1.Go to &lt;strong&gt;CodePipeline&lt;/strong&gt; and click on &lt;strong&gt;Get Started&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Give the pipeline a name and create a new service role as shown below, then click Next.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DI6ARRA4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166220689-93dbea15-c080-40a8-949a-89659ede2a98.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DI6ARRA4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166220689-93dbea15-c080-40a8-949a-89659ede2a98.png" alt="image" width="880" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;2.Choose &lt;strong&gt;AWS CodeCommit&lt;/strong&gt; as the source and pick the repository name from the dropdown. Then choose the &lt;strong&gt;master&lt;/strong&gt; branch and click Next.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3rpjX3WB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166220639-a5a874bd-2ac1-4872-8418-6b829b8e1620.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3rpjX3WB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/166220639-a5a874bd-2ac1-4872-8418-6b829b8e1620.png" alt="image" width="880" height="467"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3.Next, for the build provider select CodeBuild, create a new build project, give it a name, and configure it as follows.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tKWkhasM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169281262-62402be6-6322-4c16-893e-93538956204e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tKWkhasM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169281262-62402be6-6322-4c16-893e-93538956204e.png" alt="image" width="587" height="622"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Make sure to check the &lt;strong&gt;Privileged&lt;/strong&gt; box (required for building Docker images), and make sure that the new service role is created and the EC2 permission is added as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--S9tV9noe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169281298-a2a82932-1d25-4c24-8cc1-b71ca7ad2608.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--S9tV9noe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169281298-a2a82932-1d25-4c24-8cc1-b71ca7ad2608.png" alt="image" width="564" height="571"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;4.This creates a new service role, to which we have to add ECR permissions. Open the IAM console in a new tab, go to Roles, search for and select the role created above, and click Attach policies. Search for ECR, select the policy shown below, and click Attach policy.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dajg31Ua--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169281388-037ac079-c521-4970-926c-4e2b6709ee69.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dajg31Ua--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169281388-037ac079-c521-4970-926c-4e2b6709ee69.png" alt="image" width="880" height="271"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For the Deploy provider, select Amazon ECS, then choose the cluster and service name. Also, set the name of the image definitions file to &lt;strong&gt;images.json&lt;/strong&gt;, which we create during the build process.&lt;/p&gt;

&lt;p&gt;Here choose the name from the dropdown and ignore the names from the images below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8y-MT7vN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169281469-ce35dbdc-42ef-43c6-b6b4-84406bc43ecb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8y-MT7vN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169281469-ce35dbdc-42ef-43c6-b6b4-84406bc43ecb.png" alt="image" width="734" height="541"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once all the configurations are done, you should have something like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--W1iFUZSE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169282910-3e184823-be0b-448b-b7b8-328c094e9736.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--W1iFUZSE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/169282910-3e184823-be0b-448b-b7b8-328c094e9736.png" alt="image" width="487" height="683"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can then make changes in the code and push a commit to the repository; the pipeline will run, push the new image to ECR, and deploy the latest changes to ECS.&lt;/p&gt;

&lt;p&gt;If you do, &lt;strong&gt;yes&lt;/strong&gt;, you have implemented a scalable, cloud-native DevOps pipeline for Machine Learning and Data Engineering.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Don't forget to leave a Like, Share and Comment !!!!!&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;References:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.analyticsvidhya.com/blog/2021/07/a-step-by-step-guide-to-create-a-ci-cd-pipeline-with-aws-services/"&gt;Analytics Vidya&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/codebuild/latest/userguide/sample-docker.html"&gt;AWS Docker&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/codebuild/latest/userguide/change-project-console.html"&gt;Code Build&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/codepipeline/latest/userguide/welcome.html"&gt;Code Pipeline&lt;/a&gt;&lt;/p&gt;

</description>
      <category>devops</category>
      <category>aws</category>
      <category>datascience</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Building Data Engineering Pipelines on AWS</title>
      <dc:creator>Hari Pranav A</dc:creator>
      <pubDate>Mon, 21 Feb 2022 13:50:12 +0000</pubDate>
      <link>https://dev.to/haripranav/building-data-engineering-pipelines-on-aws-535k</link>
      <guid>https://dev.to/haripranav/building-data-engineering-pipelines-on-aws-535k</guid>
<description>&lt;p&gt;Data Engineering involves building pipelines to produce data-driven decisions from a variety of data sources, as shown below:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Relational Databases&lt;/li&gt;
&lt;li&gt;Non Relational Databases&lt;/li&gt;
&lt;li&gt;Data Marts&lt;/li&gt;
&lt;li&gt;Streaming sources, etc.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In this blog we will create an end-to-end data pipeline from a Flask web application, which will make use of big data tools like Athena to query the data on AWS and finally visualize it on the cloud in near real time.&lt;/p&gt;

&lt;p&gt;The architecture flow is as follows:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hQJjDfQm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/154731819-b06da20b-d574-4303-9f34-e963e2e86613.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hQJjDfQm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/154731819-b06da20b-d574-4303-9f34-e963e2e86613.png" alt="Architecture" width="825" height="542"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The Flask application is hosted inside EC2 and uses the AWS Data Wrangler package to interact with AWS. A query is run in the backend which hits the Athena table.&lt;/li&gt;
&lt;li&gt;The data returned is stored as a data frame in the Flask application.&lt;/li&gt;
&lt;li&gt;The data frame is converted to a CSV and pushed to S3.&lt;/li&gt;
&lt;li&gt;A Glue crawler runs every time a new file is pushed to S3. This is useful when we run queries multiple times, which produces a CSV file for each fired query.&lt;/li&gt;
&lt;li&gt;The Glue Catalog holds the database and the DDL for creating the table in Athena.&lt;/li&gt;
&lt;li&gt;The table in Athena can be connected to Quicksight, which is used to visualize the data.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Flask Application  on Ec2 interacting with AWS CLI :
&lt;/h2&gt;

&lt;p&gt;This front end can have any number of input methods, such as input boxes and radio buttons, which need to be relevant to the existing query. Hosting this front end is covered below.&lt;/p&gt;

&lt;p&gt;Here we can use a micro instance from the EC2 dashboard; see the link below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://aws.amazon.com/ec2/getting-started/"&gt;Blogpost on micro instance deployment&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After launching the instance, we need to SSH into it and install the necessary dependencies:&lt;/p&gt;

&lt;h4&gt;
  
  
  Dependencies:
&lt;/h4&gt;

&lt;ol&gt;
&lt;li&gt;AWS cli&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This provides control over most of the services on AWS. It must be installed as per the blogpost below to interact with Athena and other services:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html"&gt;AWS cli&lt;/a&gt;&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;Flask with Virtual environment&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Link: &lt;a href="https://flask.palletsprojects.com/en/0.12.x/installation/"&gt;Install flask&lt;/a&gt;&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt;AWS data wrangler:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;After activating the virtual environment use the link below to install AWS data wrangler&lt;br&gt;
Link: &lt;a href="https://aws-data-wrangler.readthedocs.io/en/stable/"&gt;Install wrangler&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the instance is up and running we need to create a folder structure for our front end application:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Tp8LR15d--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/154733004-5c409584-106f-4b1b-821b-7199a87684f4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Tp8LR15d--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/28874545/154733004-5c409584-106f-4b1b-821b-7199a87684f4.png" alt="image" width="423" height="437"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Creating the index.html file:
&lt;/h4&gt;

&lt;p&gt;Here we need to create multiple text boxes. Each text box will contain an &lt;strong&gt;input field&lt;/strong&gt; as well as a &lt;strong&gt;label&lt;/strong&gt;, which can then be referenced in the &lt;strong&gt;app.py&lt;/strong&gt; file to execute the query.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;**index.html**
&amp;lt;html&amp;gt;
&amp;lt;head&amp;gt;&amp;lt;center&amp;gt;&amp;lt;h3&amp;gt;Ornament Verification Report &amp;lt;/h3&amp;gt;&amp;lt;/center&amp;gt;&amp;lt;/head&amp;gt;
&amp;lt;style&amp;gt;

.orn1
{
border: 1px outset blue;
background-color: lightblue;
padding-top: 10px;
padding-left: 10px;
padding-right: 10px;
padding-bottom: 10px;

}
&amp;lt;/style&amp;gt;
&amp;lt;body&amp;gt;

&amp;lt;form action="{{ url_for('test') }}" method="post"&amp;gt;

    &amp;lt;div class="orn1"&amp;gt;
    &amp;lt;label for="ornament1"&amp;gt;Ornament1&amp;lt;/label&amp;gt;
    &amp;lt;input type="text" id="ornament1" name="ornament1" placeholder="enter b1/b2/b3/b4" required&amp;gt;
    &amp;amp;nbsp;

    &amp;lt;label for="lowercount"&amp;gt;Lowercount&amp;lt;/label&amp;gt;
    &amp;lt;input type="text" id="lowercount1" name="lowercount1" placeholder="enter lowercount" required&amp;gt;

    &amp;amp;nbsp;
    &amp;lt;label for="uppercount"&amp;gt;Uppercount&amp;lt;/label&amp;gt;
    &amp;lt;input type="text" id="uppercount1" name="uppercount1" placeholder="enter Uppercount" required&amp;gt;

    &amp;lt;br&amp;gt;
    &amp;lt;br&amp;gt;
    &amp;lt;label for="grossweightlowerrange"&amp;gt;Gross Weight Lower Range&amp;lt;/label&amp;gt;
    &amp;lt;input type="text" id="grossweightlowerrange1" name="grossweightlowerrange1" placeholder="enter lower range" required&amp;gt;
    &amp;amp;nbsp;&amp;amp;nbsp;
    &amp;lt;label for="grossweightupperrange"&amp;gt;Gross Weight Upper Range&amp;lt;/label&amp;gt;
    &amp;lt;input type="text" id="grossweightupperrange1" name="grossweightupperrange1" placeholder="enter upper range" required&amp;gt;

&amp;lt;/div&amp;gt;


    &amp;lt;br&amp;gt;

    &amp;lt;div class="submitbutton"&amp;gt;
    &amp;lt;button type="submit"&amp;gt;Execute&amp;lt;/button&amp;gt;&amp;lt;/div&amp;gt;
    &amp;lt;/form&amp;gt;
&amp;lt;/body&amp;gt;
    &amp;lt;/html&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The main logic is developed in the backend, where the query is built and executed dynamically based on the number of ornaments entered.&lt;/p&gt;
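&lt;p&gt;This dynamic execution can be sketched roughly as follows. The table and column names here are assumptions for illustration only, and in production the inputs should be validated or parameterized rather than interpolated into SQL:&lt;/p&gt;

```python
# Build a WHERE clause from however many ornaments the user entered.
# Table/column names (ornaments_table, ornament, gross_weight) are
# hypothetical; adapt them to the real Athena schema.
def build_query(ornaments, lower, upper):
    clauses = " OR ".join(f"ornament = '{o}'" for o in ornaments)
    return (
        "SELECT * FROM ornaments_table "
        f"WHERE ({clauses}) "
        f"AND gross_weight BETWEEN {float(lower)} AND {float(upper)}"
    )

print(build_query(["b1", "b2"], 10, 20))
```

&lt;p&gt;The resulting string can then replace the "query" placeholder passed to wr.athena.read_sql_query in app.py.&lt;/p&gt;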

&lt;h4&gt;
  
  
  Creating the &lt;strong&gt;app.py&lt;/strong&gt; file with the execution logic
&lt;/h4&gt;

&lt;p&gt;Here we are referencing the &lt;strong&gt;ID&lt;/strong&gt; of the label from the HTML file&lt;/p&gt;

&lt;p&gt;Example :&lt;br&gt;
first_ornament=request.form.get("&lt;strong&gt;ornament1&lt;/strong&gt;")&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import time
from flask import Flask, request, render_template
import awswrangler as wr
# Flask constructor
app = Flask(__name__)

@app.route('/', methods=["GET", "POST"])
def gfg():
    if request.method == "POST":
        first_ornament=request.form.get("ornament1")
        second_ornament = request.form.get("ornament1")
        uppercount2 = request.form.get("uppercount1")
        lowercount2 = request.form.get("lowercount1")
        grossweightlowerrange2 = request.form.get("grossweightlowerrange1")
        grossweightupperrange2 = request.form.get("grossweightupperrange1")
        # Get the output into a dataframe by using **wr** which is the aws datawrangler package to interact with AWS
        df = wr.athena.read_sql_query("query", database="databasename")
        # Specify the path of the bucket where the result will be stored as **file1.csv**
        bucket = 'input_bucketname'
        path1 = f"s3://{bucket}/file1.csv"
        # Write the csv file to the S3 bucket
        wr.s3.to_csv(df, path1, index=False)
return render_template("index.html")
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now we have successfully taken input from the user -&amp;gt; got back the result as a dataframe -&amp;gt; converted it to a CSV -&amp;gt; pushed it to S3.&lt;/p&gt;

&lt;h2&gt;
  
  
  2. AWS S3 bucket, Glue, Athena, Quicksight:
&lt;/h2&gt;

&lt;p&gt;In the AWS user account we need to enable S3, Glue, Athena and Quicksight, following the blogs below in sequence.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YK9Y0ooU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/beh5yyubr1v06ej1xjck.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YK9Y0ooU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/beh5yyubr1v06ej1xjck.jpg" alt="Image description" width="481" height="720"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/AmazonS3/latest/userguide/Welcome.html"&gt;S3&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/glue/latest/dg/what-is-glue.html"&gt;Glue&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/athena/latest/ug/what-is.html"&gt;Athena&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once all three services are launched for the respective user ID, we need to enable the Glue crawler to read the data into Athena as soon as it is put into S3.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/glue/latest/dg/monitor-glue.html"&gt;Glue Crawler architecture&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The data flow is as follows,&lt;/p&gt;

&lt;p&gt;Once the data sits in s3 it will trigger a glue crawler which is automatically configured to read the latest data from s3.&lt;/p&gt;

&lt;p&gt;Then a glue catalog schema is generated on the latest data in S3 which is then read by Quicksight.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open AWS Glue and select &lt;strong&gt;Crawlers&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NwucxRyb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/AWSWrangler/AWS_Glue_Crawler.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NwucxRyb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/AWSWrangler/AWS_Glue_Crawler.png" alt="Glue Crawler" width="139" height="217"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;In the crawler settings, select the options as follows&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0hLFIEgQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/AWSWrangler/AWS_Glue_Crawler_Setting.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0hLFIEgQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/AWSWrangler/AWS_Glue_Crawler_Setting.png" alt="Glue crawler settings" width="633" height="289"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt;In Athena, create the DDL as shown below&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rw0jKfxZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/AWSWrangler/DDLgeneration.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rw0jKfxZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/AWSWrangler/DDLgeneration.png" alt="DDL" width="202" height="250"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xihuEb9W--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://github.com/HariPranav/BENIMAGES/blob/master/AWSWrangler/DDL_execution.png%3Fraw%3Dtrue" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xihuEb9W--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://github.com/HariPranav/BENIMAGES/blob/master/AWSWrangler/DDL_execution.png%3Fraw%3Dtrue" alt="ExecuteDDL" width="223" height="235"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="4"&gt;
&lt;li&gt;In Quicksight, create a dataset and point it to the Athena table as shown below&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Click on add a new dataset as shown below&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lJ_Jbiet--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/AWSWrangler/Quicksight_Datasets.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lJ_Jbiet--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/AWSWrangler/Quicksight_Datasets.png" alt="Quicksight" width="352" height="706"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select the same Athena table where the DDL was generated, as per the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--O_O0DMYT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/AWSWrangler/Quicksight_Datasets_Addition.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--O_O0DMYT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/AWSWrangler/Quicksight_Datasets_Addition.png" alt="Quicksight datasource" width="862" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>python</category>
      <category>aws</category>
      <category>datascience</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Apache Superset - A Swiss Army Knife for Analyzing Data </title>
      <dc:creator>Hari Pranav A</dc:creator>
      <pubDate>Mon, 21 Feb 2022 13:22:27 +0000</pubDate>
      <link>https://dev.to/haripranav/apache-superset-a-swiss-army-knife-for-analyzing-data-4kd9</link>
      <guid>https://dev.to/haripranav/apache-superset-a-swiss-army-knife-for-analyzing-data-4kd9</guid>
<description>&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkgv28t6ac28f46fmkcla.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkgv28t6ac28f46fmkcla.jpg" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Apache Superset: A Visualization Tool
&lt;/h1&gt;

&lt;p&gt;Apache Superset is an open-source visualization tool which provides out-of-the-box integrations with a wide variety of databases and cloud platforms. It can be easily deployed on EC2 machines and has features that meet production-grade requirements, such as:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;POWERFUL YET EASY TO USE&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Quickly and easily integrate and explore your data, using either the simple no-code viz builder or the state-of-the-art SQL IDE.&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;INTEGRATES WITH MODERN DATABASES&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Superset can connect to any SQL based data source through SQLAlchemy, including modern cloud native databases and engines at petabyte scale.&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt;MODERN ARCHITECTURE&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Superset is lightweight and highly scalable, leveraging the power of your existing data infrastructure without requiring yet another ingestion layer.&lt;/p&gt;

&lt;ol start="4"&gt;
&lt;li&gt;RICH VISUALIZATIONS AND DASHBOARDS&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Superset ships with a wide array of beautiful visualizations. Its visualization plug-in architecture makes it easy to build custom visualizations that drop directly into Superset.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why Superset is production-ready and comparable to BI tools like Quicksight, Power BI and Metabase
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;ROW LEVEL SECURITY:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Superset allows us to manage access to the different rows in a particular database or table by giving the flexibility of specifying a field or column for which access is to be denied.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;USE CASE&lt;/strong&gt; :&lt;/p&gt;

&lt;p&gt;Assume we have a table containing branch-level data with 1 lakh (100,000) branches as unique keys. If we had to generate separate analytics for each branch, that would mean building 1 lakh dashboards. Superset lets us manage this scenario by specifying roles and data access, so we can create a single dashboard and serve different views to different users based on their roles.&lt;/p&gt;
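&lt;p&gt;The idea can be sketched in a few lines of Python. This is only an illustration of the filtering behaviour; the table contents and role names below are made up, and Superset itself stores RLS rules as SQL filter clauses attached to roles rather than Python predicates:&lt;/p&gt;

```python
# One shared dataset; a per-role filter restricts which rows each
# branch user can see. (Illustrative only -- not Superset internals.)
ROWS = [
    {"branch": "BR001", "revenue": 120},
    {"branch": "BR002", "revenue": 340},
    {"branch": "BR001", "revenue": 75},
]

# RLS rule: role -> predicate appended to every query for that role
RLS_RULES = {
    "branch_BR001": lambda row: row["branch"] == "BR001",
    "branch_BR002": lambda row: row["branch"] == "BR002",
}

def query_for_role(role):
    """Return only the rows the role's filter permits."""
    predicate = RLS_RULES[role]
    return [r for r in ROWS if predicate(r)]

print(query_for_role("branch_BR001"))  # only the two BR001 rows
```

&lt;p&gt;One dashboard query, many filtered views: this is exactly what saves us from building a dashboard per branch.&lt;/p&gt;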

&lt;ol start="2"&gt;
&lt;li&gt;GRANULARITY IN PERMISSIONS:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Superset offers us the flexibility of creating ROLES and specifying the permissions for each role. There are multiple drag and drop permissions which can be easily added to a CUSTOM role ensuring fine granularity in the permissions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;USE CASE&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;When a large number of parties want different views of a particular dashboard, creating each one individually becomes very difficult. Superset's granular, role-based approach lets the admin manage groups and assign users to the appropriate group as they are created.&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt;AUTO MAILING FEATURE AND SHARING OF DATA:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Superset lets us configure mailing services that automate the entire process, from report creation to delivery to each stakeholder’s email.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;USE CASE&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;It is often difficult to share dashboards in a standard format when a large number of parties are involved. Superset's auto-mailing feature lets the admin automate the scheduling and emailing of dashboards.&lt;/p&gt;

&lt;ol start="4"&gt;
&lt;li&gt;INTEGRATION WITH ACTIVE DIRECTORY&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Superset lets us integrate with various authentication services, including Active Directory via the LDAP protocol.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;USE CASE&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;If a company has more than 100 employees, manually creating and managing row-level security permissions for each user is time-consuming, and data entry errors are a pain to deal with.&lt;/p&gt;

&lt;ol start="5"&gt;
&lt;li&gt;DATA REDUNDANCY AND AVAILABILITY:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;There are currently three ways of deploying Superset&lt;br&gt;
• DOCKER&lt;br&gt;
• KUBERNETES&lt;br&gt;
• PIP INSTALL&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;USE CASE&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;Scalability is a major concern: as the number of users grows, run times usually slow down. Kubernetes lets us manage multiple deployments of Docker images in the cloud and orchestrate and spin up containers as traffic increases, which keeps load times fast and improves the user experience.&lt;/p&gt;

&lt;ol start="6"&gt;
&lt;li&gt;SQL LAB&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;SQL Lab lets us connect directly to a variety of data sources and prototype queries against their tables in real time.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;USE CASE&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;Visualizations are usually built on top of query logic. Superset lets us view the data, run the query, and visualize the prototype in a single window, which improves developer productivity since all the information is in one place.&lt;/p&gt;

&lt;h3&gt;
  
  
  Installing Superset
&lt;/h3&gt;

&lt;p&gt;According to the documentation, there are three main methods of installing Superset, as mentioned above; here we install it from a Docker image on an AWS EC2 instance.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Let’s set up the EC2 Image on AWS&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Register for a free AWS account, navigate to the EC2 page, and launch a free-tier micro instance&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154719725-6129b0cf-c005-4c6d-9ed0-d74ab1cb590b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154719725-6129b0cf-c005-4c6d-9ed0-d74ab1cb590b.png" alt="AWS Console"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;After clicking SELECT in the image above, leave all options at their defaults. In the CONFIGURE SECURITY GROUP tab, open port 8088, since Superset runs there, as shown in the image below.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154720235-d7c17027-f81d-4d78-b602-7d199b6b9d5f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154720235-d7c17027-f81d-4d78-b602-7d199b6b9d5f.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Log in to the instance and proceed to install Docker&lt;/li&gt;
&lt;/ul&gt;

&lt;ol start="2"&gt;
&lt;li&gt;Next, install Docker Engine and Docker Compose on the Ubuntu instance from the links below&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Docker Engine : &lt;a href="https://docs.docker.com/engine/install/ubuntu/" rel="noopener noreferrer"&gt;https://docs.docker.com/engine/install/ubuntu/&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Docker Compose: &lt;a href="https://docs.docker.com/compose/install/" rel="noopener noreferrer"&gt;https://docs.docker.com/compose/install/&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;ol start="3"&gt;
&lt;li&gt;Then we can install Superset according to the documentation below:
&lt;a href="https://superset.apache.org/docs/installation/installing-superset-using-docker-compose" rel="noopener noreferrer"&gt;https://superset.apache.org/docs/installation/installing-superset-using-docker-compose&lt;/a&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154720586-38286035-aeec-4da1-a751-112441aa01e6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154720586-38286035-aeec-4da1-a751-112441aa01e6.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, if we want to add a connector for a specific database, then before running the command&lt;/p&gt;

&lt;p&gt;$ docker-compose -f docker-compose-non-dev.yml up&lt;/p&gt;

&lt;p&gt;we need to install the respective connector as described in the link below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://superset.apache.org/docs/databases/dockeradddrivers" rel="noopener noreferrer"&gt;https://superset.apache.org/docs/databases/dockeradddrivers&lt;/a&gt;&lt;br&gt;
In our use case we need to install the Athena driver, so in the command&lt;/p&gt;

&lt;p&gt;$ echo "mysqlclient" &amp;gt;&amp;gt; ./docker/requirements-local.txt&lt;/p&gt;

&lt;p&gt;we need to replace "mysqlclient" with "pyathena". The complete list of drivers can be found in the link below:&lt;br&gt;
&lt;a href="https://superset.apache.org/docs/databases/installing-database-drivers" rel="noopener noreferrer"&gt;https://superset.apache.org/docs/databases/installing-database-drivers&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154720911-ab2dc520-ceef-4ea8-9953-7483016ccff8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154720911-ab2dc520-ceef-4ea8-9953-7483016ccff8.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once Superset is installed, we can log in using the default username and password, admin/admin.&lt;/p&gt;

&lt;h3&gt;
  
  
  Row Level Security on User Level
&lt;/h3&gt;

&lt;p&gt;Row-level security provides an easy way to manage roles, permissions, and fine-grained data access. It can be configured by opening the SETTINGS menu and selecting ROW LEVEL SECURITY, as shown in the figure below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154721354-72261ca0-ef43-4517-97b9-06ee9f325182.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154721354-72261ca0-ef43-4517-97b9-06ee9f325182.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then click the new (+) icon on the right side and fill in all the required fields, as shown in the image below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154721480-489a963c-f4d9-425a-862c-b09cbe79ecd5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154721480-489a963c-f4d9-425a-862c-b09cbe79ecd5.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then we can add a new user and assign them this ROLE, which ensures that row-level security is enforced and that they can see only the records to which they have access.&lt;/p&gt;

&lt;h3&gt;
  
  
  Row Level Security on Data Set Level
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;USE CASE :&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Suppose a business has about 10,000 users spread across 100 branches, and we have to display a dashboard such that each branch user can view only their own branch's performance for the current quarter, not that of other branches.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Solution:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The naive approach is to manually set row-level security for each user based on their username, as shown in the image below, and also to manually add every table to the ROLE section for that user. Any table added in the future would have to be manually added to the row-level security as well.&lt;/p&gt;

&lt;p&gt;Although the documentation is not clear on how to solve this, after a lot of research we found that there is a built-in DYNAMIC FILTERING option using JINJA TEMPLATES. The line below uses the template to return the current username, which lets us filter later at the DASHBOARD level.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154963389-1f48e5bf-fc45-493f-a043-a9b1cf6a44be.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154963389-1f48e5bf-fc45-493f-a043-a9b1cf6a44be.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;
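&lt;p&gt;The effect of the templated dataset SQL can be simulated in plain Python. Superset itself renders this with Jinja2 and its own template context, so the substitution below is only a stand-in, and the table and column names are made up for illustration:&lt;/p&gt;

```python
# Simulation of Superset's Jinja-templated dataset SQL: the
# {{ current_username() }} macro is replaced with the logged-in user
# before the query runs, so each user sees only their own rows.
# (Plain string substitution here; Superset uses Jinja2 internally.)

DATASET_SQL = (
    "SELECT * FROM branch_metrics "
    "WHERE username = '{{ current_username() }}'"
)

def render(sql, username):
    return sql.replace("{{ current_username() }}", username)

print(render(DATASET_SQL, "alice"))
```

&lt;p&gt;Because the filter is evaluated per logged-in user at query time, a single dataset and dashboard serve all 10,000 users without any per-user RLS rules.&lt;/p&gt;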

&lt;p&gt;Template processing needs to be enabled first:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Log in to the EC2 instance and locate the superset docker folder.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Next, go to /superset/superset/config.py&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;If we open this file in nano and search (Ctrl+W) for ENABLE_TEMPLATE_PROCESSING, we can see that it is set to False&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154721969-0a9bef9e-7b40-4166-9a26-fa98c7904045.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154721969-0a9bef9e-7b40-4166-9a26-fa98c7904045.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="4"&gt;
&lt;li&gt;Setting it to True here does not take effect, because Superset is designed so that such settings are overridden in another file called superset_config.py.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154722120-d89f1115-8ee1-4177-b24b-9fdcd10831e7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154722120-d89f1115-8ee1-4177-b24b-9fdcd10831e7.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="5"&gt;
&lt;li&gt;&lt;p&gt;Whatever changes we make in this file override the corresponding values in config.py.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To see how to enable template processing, we need to check config.py&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154722229-aa75bba3-a677-44a5-beed-4b01b67cb416.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154722229-aa75bba3-a677-44a5-beed-4b01b67cb416.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="7"&gt;
&lt;li&gt;It tells us to set the FEATURE_FLAGS entry to True. Open /superset/docker/pythonpath_dev/superset_config.py and search for FEATURE_FLAGS. Then, on a new line, enable template processing and restart the instance.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154722338-a85a202c-dd21-4658-aba7-d7184a2a3d12.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154722338-a85a202c-dd21-4658-aba7-d7184a2a3d12.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;
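&lt;p&gt;The override in superset_config.py amounts to a single dictionary entry. ENABLE_TEMPLATE_PROCESSING is the documented flag name; any other flags already in the file can be left as they are:&lt;/p&gt;

```python
# In docker/pythonpath_dev/superset_config.py -- values set here
# override the defaults from superset/config.py.
FEATURE_FLAGS = {
    "ENABLE_TEMPLATE_PROCESSING": True,
}
```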

&lt;ol start="8"&gt;
&lt;li&gt;Once we restart the instance, we can create a dataset that contains a unique username column.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154722456-016ca85c-ae85-4f1c-ac95-6008baaa9edd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154722456-016ca85c-ae85-4f1c-ac95-6008baaa9edd.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="9"&gt;
&lt;li&gt;Here we need to select Edit Dataset and then choose the legacy SQL editor.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154722706-27052659-ce5f-48a4-a277-a93800fb3067.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154722706-27052659-ce5f-48a4-a277-a93800fb3067.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="10"&gt;
&lt;li&gt;Then paste the template string based on your use case. Here we want the logged-in user to view only the dashboards containing their own username, not those of other users.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Refreshing a Dataset:
&lt;/h3&gt;

&lt;p&gt;Since we have enabled the Athena driver on the Superset instance, changing the source database or reloading the data after dashboards have been created causes issues: the old dashboards do not reflect the new dataset.&lt;/p&gt;

&lt;p&gt;This issue can be solved by:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Clicking Edit Dataset and navigating to the Columns section&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154722882-53b70c52-a144-4867-8980-0b62f93ddb1f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154722882-53b70c52-a144-4867-8980-0b62f93ddb1f.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154723088-4faafa59-d5bc-4f7d-b131-cbebb95163c8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154723088-4faafa59-d5bc-4f7d-b131-cbebb95163c8.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;Now, if we click Sync Columns from Source, all the old dashboards will reflect the new dataset.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Increasing Storage on EC2 as the datasets increase:
&lt;/h2&gt;

&lt;p&gt;Go to the console, choose the volume, then select Modify and change the storage. Volume size can only be increased; once done, we can't downgrade.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154723392-9f076929-ccad-4f2d-8fe7-e1183e0f25a1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154723392-9f076929-ccad-4f2d-8fe7-e1183e0f25a1.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We then need to manually extend the file system on the EC2 instance.&lt;/p&gt;

&lt;p&gt;$ df -Th&lt;/p&gt;

&lt;p&gt;This command shows the volume that was increased; in this case it is xvda&lt;/p&gt;

&lt;p&gt;$ growpart /dev/xvda 1&lt;/p&gt;

&lt;p&gt;This will grow the partition&lt;/p&gt;

&lt;p&gt;$ resize2fs /dev/xvda1&lt;/p&gt;

&lt;h2&gt;
  
  
  User Analytics on Superset:
&lt;/h2&gt;

&lt;p&gt;The analytics can be viewed in the SQL lab:&lt;/p&gt;

&lt;p&gt;It is present in the PostgreSQL metadata database, schema public, table ab_user&lt;/p&gt;

&lt;p&gt;Query: SELECT * FROM ab_user WHERE EXTRACT(MONTH FROM last_login) = 10&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154723908-0be2af80-b34f-4343-ae18-0ed4fb26f8b9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154723908-0be2af80-b34f-4343-ae18-0ed4fb26f8b9.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;
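&lt;p&gt;The same month-of-last-login filter can be demonstrated on an in-memory SQLite copy of an ab_user-style table. Superset's metadata DB is Postgres, where EXTRACT(MONTH FROM last_login) works directly; SQLite spells the same thing strftime('%m', ...). The sample usernames and timestamps below are made up:&lt;/p&gt;

```python
# Filter users by the month of their last login, mirroring the
# EXTRACT(MONTH FROM last_login) = 10 query from the post.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE ab_user (username TEXT, last_login TEXT)")
con.executemany(
    "INSERT INTO ab_user VALUES (?, ?)",
    [("alice", "2021-10-05 09:00:00"),
     ("bob",   "2021-09-30 17:45:00"),
     ("carol", "2021-10-21 11:30:00")],
)

# SQLite equivalent of EXTRACT(MONTH FROM last_login) = 10
october = con.execute(
    "SELECT username FROM ab_user "
    "WHERE CAST(strftime('%m', last_login) AS INTEGER) = 10 "
    "ORDER BY username"
).fetchall()
print(october)  # [('alice',), ('carol',)]
```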

&lt;h3&gt;
  
  
  Embedding Dashboards on Another Application:
&lt;/h3&gt;

&lt;p&gt;First, you need to update the PUBLIC ROLE under Settings with these options.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;can explore json on Superset&lt;/li&gt;
&lt;li&gt;can dashboard on Superset&lt;/li&gt;
&lt;li&gt;all database access on all_database_access&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Second, embed your dashboard in your HTML:&lt;/p&gt;

&lt;p&gt;&amp;lt;iframe src="localhost:8088/superset/dashboard/5/?standalone=true"&amp;gt;&amp;lt;/iframe&amp;gt;&lt;/p&gt;

&lt;p&gt;Here ‘5’ in the URL specifies the dashboard number&lt;/p&gt;

&lt;h3&gt;
  
  
  Creation of Datasets:
&lt;/h3&gt;

&lt;p&gt;To create a Dashboard we need to follow the steps below:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to the DATA tab&lt;/li&gt;
&lt;li&gt;Select create a DATASET button as shown in the image below&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154724204-d241d9b3-6669-4bab-96cb-a48bc013915d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154724204-d241d9b3-6669-4bab-96cb-a48bc013915d.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt;&lt;p&gt;Select the database, the schema, and the table name&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Then choose the dataset and select the visualisation type as TABLE&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154724420-5bea0458-d7a0-4b27-9f70-128cf013e208.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154724420-5bea0458-d7a0-4b27-9f70-128cf013e208.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="5"&gt;
&lt;li&gt;Add all the respective fields and run the query. After that save the result in a new DASHBOARD.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154724546-fb2cb612-5dbf-45f6-bd95-0d535c0f7e5c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154724546-fb2cb612-5dbf-45f6-bd95-0d535c0f7e5c.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Backup and Restore Superset Via Docker
&lt;/h3&gt;

&lt;p&gt;A Docker container does not have persistent storage by default, so when the container is torn down or crashes, all the data stored in the DB inside it is lost.&lt;br&gt;
We can find out more in the docs: &lt;a href="https://docs.docker.com/storage/volumes/" rel="noopener noreferrer"&gt;https://docs.docker.com/storage/volumes/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154724701-3d5d5e57-0b73-4a31-907d-0b9b122b5a80.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154724701-3d5d5e57-0b73-4a31-907d-0b9b122b5a80.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Hence, if we need to take backups of the data, it is crucial to identify the mount point of the Docker data on the operating system; on Ubuntu this is /var/lib/docker.&lt;br&gt;
STEPS TO BACK UP AND RESTORE DATA IN DOCKER&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create a copy of the current volume&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;cp -r [source] [destination]&lt;br&gt;
Here, since we already have a backup in backup_vol, we copy it into /var/lib/docker&lt;/p&gt;

&lt;p&gt;$ sudo cp -r /home/ubuntu/backup_vol/docker /var/lib/docker&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;Stop the Docker service as well:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;$sudo service docker stop&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt;Stop docker containers by using the command&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;$ sudo docker stop $(sudo docker ps -q)&lt;/p&gt;

&lt;p&gt;Make sure all the containers show as Exited; this takes a bit of time.&lt;/p&gt;

&lt;p&gt;Recheck by running&lt;br&gt;
$ docker ps -a&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154725127-863c24cf-f729-4de3-b010-cc2d4e3ef03d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154725127-863c24cf-f729-4de3-b010-cc2d4e3ef03d.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="4"&gt;
&lt;li&gt;Stop the containers and remove the containers, networks, volumes, and images created by up:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;$ sudo docker-compose down&lt;/p&gt;

&lt;ol start="5"&gt;
&lt;li&gt;&lt;p&gt;For testing purposes, remove the current docker volume from the file system by running&lt;br&gt;
$ sudo su&lt;br&gt;
$ cd /var/lib&lt;br&gt;
$ rm -R docker&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;If the device is busy, we need to use the umount command.&lt;br&gt;&lt;br&gt;
$ umount overlay&lt;br&gt;
Doing this removes all the persistent data from the disk, including the row-level permissions we had set.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154725218-62e12c96-2316-4819-8547-82744036eb88.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154725218-62e12c96-2316-4819-8547-82744036eb88.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154725310-c7d8c802-597c-48da-ab35-73df146f3d44.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154725310-c7d8c802-597c-48da-ab35-73df146f3d44.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here there is no user with row-level security.&lt;/p&gt;

&lt;ol start="7"&gt;
&lt;li&gt;Now we have a backup in the home folder, which needs to be copied to /var/lib/docker&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;sudo cp -r /home/ubuntu/backup_vol/docker /var/lib/&lt;/p&gt;

&lt;p&gt;Here the name of the backup is docker in the folder backup_vol&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154725420-a654732b-d647-40dd-89bb-ccde32397b9f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F28874545%2F154725420-a654732b-d647-40dd-89bb-ccde32397b9f.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="8"&gt;
&lt;li&gt;Now we need to restart Docker; otherwise it will cause errors while reading the data from /var/lib/docker&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Now use docker-compose up to restart from scratch&lt;/p&gt;

&lt;p&gt;$ sudo service docker restart&lt;/p&gt;

&lt;p&gt;$ sudo docker-compose up&lt;/p&gt;

&lt;p&gt;If we get errors, press Ctrl+C, wait for the service to go down, then go to the main superset folder and run the main command to start Superset.&lt;/p&gt;

&lt;p&gt;$ cd superset&lt;br&gt;
$ sudo docker-compose -f docker-compose-non-dev.yml up&lt;/p&gt;

&lt;p&gt;This ensures that the volumes will be read from the correct location from the backup folder.&lt;/p&gt;

&lt;ol start="9"&gt;
&lt;li&gt;This data needs to be backed up to the S3 layer as well; this can be done using the tar command.&lt;br&gt;
$ cd /var/lib&lt;br&gt;
$ tar -zcvf name_of_file_to_be_saved folder_name&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;• -z : Compress the archive using the gzip program&lt;br&gt;
• -c : Create the archive&lt;br&gt;
• -v : Verbose, i.e. display progress while creating the archive&lt;br&gt;
• -f : Archive file name&lt;/p&gt;

&lt;p&gt;$ tar -zcvf prev_backup.tar.gz /var/lib/docker&lt;/p&gt;
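&lt;p&gt;The compress-and-restore round trip can be sketched with Python's standard tarfile module, mirroring the -z, -c, and -f flags above. A temporary directory stands in for /var/lib/docker so the sketch can run anywhere:&lt;/p&gt;

```python
# Create a gzipped tar of a "docker" folder, then extract it and check
# the contents survive, mirroring the tar backup/restore steps above.
# (Temp paths are stand-ins for /var/lib/docker and the home folder.)
import os
import tarfile
import tempfile

workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "docker")
os.makedirs(src)
with open(os.path.join(src, "volume.dat"), "w") as f:
    f.write("persistent superset data")

archive = os.path.join(workdir, "prev_backup.tar.gz")
with tarfile.open(archive, "w:gz") as tar:  # -z (gzip) + -c (create) + -f
    tar.add(src, arcname="docker")

# Restore side: extract and read the file back.
restore = os.path.join(workdir, "restore")
with tarfile.open(archive, "r:gz") as tar:
    tar.extractall(restore)

with open(os.path.join(restore, "docker", "volume.dat")) as f:
    print(f.read())  # persistent superset data
```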

&lt;ol start="10"&gt;
&lt;li&gt;Then we need to install the AWS CLI,
&lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2-linux.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2-linux.html&lt;/a&gt;
After this we need to send the data to the S3 bucket. This can be done by copying the S3 bucket URI.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;To copy the files from EC2 to S3&lt;br&gt;
$ aws s3 cp  s3://&lt;/p&gt;

&lt;p&gt;Hence, if we have a file called prev_backup.tar.gz and the S3 bucket URI, then&lt;/p&gt;

&lt;p&gt;$ aws s3 cp /home/ubuntu/prev_backup.tar.gz s3://&lt;/p&gt;

&lt;p&gt;Now this data is sent to the respective bucket and is stored successfully.&lt;/p&gt;

&lt;ol start="11"&gt;
&lt;li&gt;To restore this copy from S3&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;$aws s3 cp s3:// &lt;/p&gt;

&lt;p&gt;Using tar, extract the file (use sudo su first, then extract)&lt;/p&gt;

&lt;p&gt;$ tar -zxvf prev_backup.tar.gz&lt;/p&gt;

&lt;p&gt;$ sudo mv prev_backup/docker /var/lib&lt;/p&gt;

&lt;p&gt;du – display disk usage:&lt;br&gt;
$ du -h foldername&lt;/p&gt;

&lt;p&gt;df – display file system sizes:&lt;br&gt;
$ df -h (shows the full file system sizes)&lt;/p&gt;

&lt;p&gt;Superset Reset automation:&lt;br&gt;
&lt;a href="https://gist.github.com/pajachiet/62eb85805cee55053d208521e0bdaf13/revisions" rel="noopener noreferrer"&gt;https://gist.github.com/pajachiet/62eb85805cee55053d208521e0bdaf13/revisions&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Automation of Superset backup to AWS.
&lt;/h1&gt;

&lt;p&gt;1. Create a script called automate_backup.sh and give it the necessary permissions.&lt;/p&gt;

&lt;p&gt;$nano automate_backup.sh&lt;/p&gt;

&lt;p&gt;This script copies the entire docker folder in /var/lib to the home directory, then compresses it&lt;/p&gt;

&lt;p&gt;#!/bin/bash&lt;br&gt;
HOME=/root LOGNAME=root PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin LANG=en_US.UTF-8 SHELL=/bin/sh PWD=/root&lt;br&gt;
sudo cp -r /var/lib/docker /home/ubuntu/prev_backup&lt;br&gt;
tar -zcvf /home/ubuntu/prev_backup.tar.gz /home/ubuntu/prev_backup/docker&lt;/p&gt;

&lt;p&gt;$sudo chmod 777 automate_backup.sh&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;Make sure that the AWS CLI is configured on the instance and the access key and secret are set.
&lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2-linux.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2-linux.html&lt;/a&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;$nano automate_s3_storage.sh&lt;/p&gt;

&lt;h1&gt;
  
  
  This script copies the zipped data from EC2 to S3. We have to specify the S3 URI by creating a folder.
&lt;/h1&gt;

&lt;p&gt;aws s3 cp /home/ubuntu/prev_backup.tar.gz s3://bucket_url&lt;/p&gt;

&lt;p&gt;$sudo chmod 777 automate_s3_storage.sh&lt;/p&gt;
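
&lt;p&gt;A slightly more defensive variant of automate_s3_storage.sh might first check that the archive exists before uploading. This is only a sketch: the bucket name below is a placeholder, and bash -n merely syntax-checks the script without running the upload.&lt;/p&gt;

```shell
# Write a defensive automate_s3_storage.sh variant line by line
# (s3://my-backup-bucket is a placeholder, not a real bucket)
printf '%s\n' \
  '#!/bin/bash' \
  'ARCHIVE=/home/ubuntu/prev_backup.tar.gz' \
  'if [ -f "$ARCHIVE" ]; then' \
  '    aws s3 cp "$ARCHIVE" s3://my-backup-bucket/' \
  'else' \
  '    echo "backup archive not found: $ARCHIVE"' \
  '    exit 1' \
  'fi' > automate_s3_storage.sh

# Validate the script without executing the upload
bash -n automate_s3_storage.sh
```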

&lt;ol&gt;
&lt;li&gt;Schedule the cron job to run the first script (automate_backup.sh)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;$ crontab -e&lt;/p&gt;

&lt;p&gt;#this creates a local backup, stored as a tar file, every day at 5:30 AM UTC (11:00 AM IST)&lt;/p&gt;

&lt;p&gt;30 5 * * * /home/ubuntu/automate_backup.sh&lt;/p&gt;

&lt;p&gt;#this sends the backup to AWS every Sunday at 5:30 AM UTC (11:00 AM IST)&lt;/p&gt;

&lt;p&gt;30 5 * * 0 /home/ubuntu/automate_s3_storage.sh&lt;/p&gt;

&lt;p&gt;#list the scheduled cron jobs&lt;/p&gt;

&lt;p&gt;$ crontab -l&lt;/p&gt;
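
&lt;p&gt;Put together, the two entries in the crontab look like this (times are UTC; the fields are minute, hour, day of month, month, day of week):&lt;/p&gt;

```
# m  h  dom mon dow  command
30   5  *   *   *    /home/ubuntu/automate_backup.sh
30   5  *   *   0    /home/ubuntu/automate_s3_storage.sh
```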

&lt;p&gt;By following all these steps, the data is successfully backed up to S3 on a periodic basis. Due to the compression, the size of the file is reduced by a factor of about 4 (a 15 GB file shrinks to 3.7 GB).&lt;/p&gt;

</description>
      <category>datascience</category>
      <category>analytics</category>
      <category>docker</category>
      <category>superset</category>
    </item>
    <item>
      <title>Beginners Guide to Hackathons</title>
      <dc:creator>Hari Pranav A</dc:creator>
      <pubDate>Wed, 14 Apr 2021 06:41:35 +0000</pubDate>
      <link>https://dev.to/haripranav/beginners-guide-to-hackathons-33od</link>
      <guid>https://dev.to/haripranav/beginners-guide-to-hackathons-33od</guid>
      <description>&lt;p&gt;&lt;a href="https://i.giphy.com/media/Lqlv6AA6hgWnszMz4w/giphy.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://i.giphy.com/media/Lqlv6AA6hgWnszMz4w/giphy.gif" alt="Hackathons"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;What is a Hackathon ??&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;A hackathon is an event in which a team of open-minded, creative people come together and try to solve a real-world problem. A "team" can consist of a mix of non-technical people, designers, thinkers, artists, and developers who are diverse in age as well as skill set. Some of them may be in their 7th grade and some of them may have a PhD.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;These events usually span 24 or 48 hours, and they are usually conducted on weekends.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In this limited time the problem statement is given on the spot, and sometimes the team formation is done at the event itself.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Why Should You Attend a Hackathon ??
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;To put it into perspective, imagine that you are attending your first "HACK" and you get paired up with a team of people you have never met before. You are not familiar with each other's skills, strengths, and weaknesses. How will you manage to come up with a solution to a given problem within the specified time limit?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Sounds challenging, right? Don't worry: these events will help you develop some of the core skills needed in the industry, and they can be learnt in the course of a night or two. The skills that you will develop are:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Communication&lt;/li&gt;
&lt;li&gt;Leadership&lt;/li&gt;
&lt;li&gt;Critical Thinking&lt;/li&gt;
&lt;li&gt;Teamwork&lt;/li&gt;
&lt;li&gt;Coordination&lt;/li&gt;
&lt;li&gt;Conflict resolution&lt;/li&gt;
&lt;li&gt;Crisis Management&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://i.giphy.com/media/XYrHWGJPtaQMM/giphy.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://i.giphy.com/media/XYrHWGJPtaQMM/giphy.gif" alt="GOT"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;We can also consider hackathons a crash course in becoming a &lt;strong&gt;CREATOR&lt;/strong&gt;, an &lt;strong&gt;Entrepreneur&lt;/strong&gt;, or a &lt;strong&gt;MAKER&lt;/strong&gt;!&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Sounds Wicked Right ??&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Apart from this, many attendees come to hackathons to "network". Networking is considered a very effective way of marketing a company; it also helps companies recruit new talent, and it lets you meet creative people with whom you may form a lifelong journey of building products.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The amazing part of a hackathon is that you can find people who have the same "vibe" and way of thinking as you do.&lt;br&gt;
Plus, if you are a student you get free attendance, free merch, free swag, and tons of confidence and knowledge to take away from the event.&lt;br&gt;
What's there to lose?&lt;br&gt;
Register for one immediately!&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Types of Hackathons :
&lt;/h1&gt;

&lt;h3&gt;
  
  
  Internal Hackathons :
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;These are usually held within companies, and they encourage the employees to collaborate with each other.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  External Hackathons :
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;This is the one to look out for, especially for students and job seekers. Companies are changing their hiring strategy by putting up challenges for potential employees and students to take part in.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The main reason is that companies want to test a potential employee's capability to solve a problem in a limited period of time with limited knowledge and resources. This model of recruitment also greatly reduces the burden on the company, as it doesn't have to allocate an extra budget for training and recruiting employees.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In today's world of startups it is crucial to adapt to changes by doing code sprints which last all night long!&lt;br&gt;
What better way to experience the "startup culture"?&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Online and Offline Hackathons:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;They can be held online on various platforms such as HackerEarth, MLH, and others.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Personally, I really enjoy attending offline hackathons, as they are mostly held in beautiful venues like co-working spaces and huge office campuses such as those of Microsoft and Google.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The entire space is given over to the event, so you can experience the culture and the delicious food. This opens up a new level of work culture: many venues have game centers with games like pool and table tennis, dedicated gyms, and a great cafeteria.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Tools For a Hack
&lt;/h1&gt;

&lt;h3&gt;
  
  
  Github :
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;What is GitHub? In simple terms, imagine that multiple people are working on a project with new ideas and updates. How do you ensure that two people have not done the same work twice, or that the version of the code is the latest one, free of bugs?
Introducing version control: GitHub helps teams manage the entire process of writing and deploying code.
The best part is that GitHub offers a student pack which gives students with a valid ID free goodies/tools worth $200k.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Get yours here&lt;/p&gt;

&lt;p&gt;&lt;a href="https://education.github.com/pack"&gt;Github student pack&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Discord (Free Voice chat for Gamers)
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;What good is a hackathon without collaboration and communication?&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Discord is one of the most versatile tools that can be used in a hack:&lt;br&gt;
-You can create private servers&lt;br&gt;
-Create voice channels&lt;br&gt;
-Bots&lt;br&gt;
-Music channels&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://discord.com/"&gt;Discord&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Google Suite(Slides,Docs,Tasks,Keep)
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The entire G Suite is a blessing to have in a hack. The true value of real-time collaboration can only be appreciated when there are only 15 minutes left to submit the solution to a hack! XD&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;These tools are a lifesaver when most of the participants have different operating systems with different word processors and applications.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;It takes away the pain of transferring files on USB.&lt;br&gt;
One can even collaboratively write books and articles and brainstorm ideas using these tools.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The only catch is that a stable internet connection is necessary, as these tools are mostly cloud based.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://gsuite.google.com/"&gt;Gsuite&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Official Docs,StackOverflow,YouTube,Reddit,Google
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The official guides may have a lot of resources, but a beginner might find it difficult to figure out how to use them.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Stack Overflow and YouTube tutorials to the rescue! The support, solutions, and videos on these platforms can be used to develop a quick prototype!&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;These tools are like your personal genie: got an error in your code? Just paste the error to get a solution :&amp;gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://stackoverflow.com/"&gt;StackOverflow&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.reddit.com/"&gt;Reddit&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Magic Buzz Words That Win a Hack
&lt;/h1&gt;

&lt;h2&gt;
  
  
  BAT - Blockchain, IoT, AI
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;In most hacks, if a team has implemented a solution with these technologies, and if it actually makes sense, it can surely secure them a spot in the top 5.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let's consider a scenario with a problem statement related to farming.&lt;/p&gt;

&lt;p&gt;We can easily use these technologies as shown below:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;IoT: sensors send the data to the cloud, where the plant health can be monitored.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;AI / Machine Learning: analyse the data and build models to predict and improve the efficiency of crop growth.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Blockchain: an immutable ledger to record the data and land records!&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;There you go: you are in the top three teams for the most innovative idea :)&lt;/p&gt;

&lt;p&gt;Your hack will look something like this!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://i.giphy.com/media/e43MD18f4wdZC/giphy.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://i.giphy.com/media/e43MD18f4wdZC/giphy.gif" alt="Homer"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Choosing Teammates
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tLBqzrKV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgflip.com/2r5ucr.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tLBqzrKV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgflip.com/2r5ucr.jpg" alt="LOLXD"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The best way to sail through a hack is to form a team a day before the event begins, and always have a backup team member in case someone bails.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;It is always good to have someone on the team who knows how to make really good PPTs, and also people who can present the project with humour. Basically, find someone who knows their memes ;&amp;gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The above point cannot be stressed enough, because the main judging criterion of a hack is usually how the presentation is done. Just imagine the fate of the judges: they have to select the best team from a group of 300-odd people.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Humour, presentation skills, and answering to the point will definitely help your team leave a good impression on the &lt;strong&gt;fatigued judges&lt;/strong&gt; :)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Last but not least, have an app/web developer who can quickly prototype a webpage/app to demonstrate the workflow of your idea.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Coming to the most important part about a hack !!
&lt;/h1&gt;

&lt;h3&gt;
  
  
  Never quit in the middle
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;It is understandable that a beginner who has never been to or heard of a hackathon cannot come up with a fully working product/prototype in a single night.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The best part of a hackathon is that you don't need to come up with a fully working solution!&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The judging criteria will have many parameters, and most of the marks that distinguish a better solution come from the uniqueness of the idea, the method of implementation, the presentation, and the workflow.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;It is totally understandable if you didn't finish what you intended to do, but always make sure that you can present what you intended with confidence to the judges. Work on your idea to such an extent that it is foolproof; use diagrams/charts/illustrations to describe your workflow!&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;And at the end, after all the hard work, even if you lose or feel disheartened, do not give up.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://i.giphy.com/media/ZIzN7YWNuTUYg/giphy.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://i.giphy.com/media/ZIzN7YWNuTUYg/giphy.gif" alt="Research"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Take the idea or hack home and keep working on it. Upload it to Git, and write an article or a blog post about the idea.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Describe the difficulties faced and the solutions to the various problems; this will help future hackers on their journey as well!&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Finally, you can also apply to different hacks with the same idea (if it is allowed), and hopefully &lt;strong&gt;Luck&lt;/strong&gt; will strike you there!&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Some resources to kick-start your journey:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://mlh.io/"&gt;MLH&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.techgig.com/hackathon/ml_hackathon"&gt;TechGig&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dorahacks.com/"&gt;DoraHacks&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.hackerearth.com/hackathon/explore/field/machine-learning/"&gt;HackerEarth&lt;/a&gt;&lt;br&gt;
&lt;a href="https://angelhack.com/"&gt;AngelHack&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Cheers !!!&lt;/p&gt;

&lt;h3&gt;
  
  
  Keeping all of this in mind, I wanted to develop an app for beginners to start off on their hackathon journey
&lt;/h3&gt;

&lt;p&gt;All of this and more in the next article!&lt;/p&gt;

</description>
      <category>hackathon</category>
      <category>computerscience</category>
      <category>software</category>
    </item>
    <item>
      <title>Documentation For Projects</title>
      <dc:creator>Hari Pranav A</dc:creator>
      <pubDate>Tue, 13 Apr 2021 15:59:54 +0000</pubDate>
      <link>https://dev.to/haripranav/documentation-for-projects-4f43</link>
      <guid>https://dev.to/haripranav/documentation-for-projects-4f43</guid>
      <description>&lt;h2&gt;
  
  
  What is the importance of Documentation ??
&lt;/h2&gt;

&lt;p&gt;Documentation describes the version, configuration, compatibility, and the various dependencies and prerequisites that a project requires to work.&lt;/p&gt;

&lt;p&gt;Documentation helps other developers understand the limitations and use cases of a project, and how to use its various parts to their advantage.&lt;/p&gt;

&lt;p&gt;Hence, in a nutshell, documentation provides a detailed guide to using a project, and all companies, big or small, have documentation!&lt;/p&gt;

&lt;h2&gt;
  
  
  What is markdown ??
&lt;/h2&gt;

&lt;p&gt;Have you used HTML? It sometimes gets so messy with all those tags when we just need a &lt;strong&gt;static&lt;/strong&gt; page with minimal functionality.&lt;/p&gt;

&lt;p&gt;Introducing Markdown !&lt;/p&gt;

&lt;p&gt;The main functionality of markdown can be found below&lt;/p&gt;

&lt;p&gt;&lt;a href="https://markdown-guide.readthedocs.io/en/latest/basics.html"&gt;https://markdown-guide.readthedocs.io/en/latest/basics.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's see some basics to bootstrap a blog page.&lt;/p&gt;

&lt;h1&gt;
  
  
  Headings
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--B42bGFHs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/janosgyerik/writing-markdown-well/master/screenshots/headings-on-stackoverflow.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--B42bGFHs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/janosgyerik/writing-markdown-well/master/screenshots/headings-on-stackoverflow.png" alt="Headings"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We use a single &lt;strong&gt;hash&lt;/strong&gt; before the text for the largest font, similar to &lt;strong&gt;h1&lt;/strong&gt; in HTML.&lt;/p&gt;

&lt;p&gt;Similarly, if we want an &lt;strong&gt;h2&lt;/strong&gt;, we use two hashes before the text we want!&lt;/p&gt;

&lt;h1&gt;
  
  
  Images from the web or local machine
&lt;/h1&gt;

&lt;p&gt;Use an exclamation mark, followed by square brackets with the alt text and round brackets with the URL.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wFpFr7Rp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/images.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wFpFr7Rp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/images.png" alt="Images"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;eg  : &lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--voui7ShJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blockchainedu.org/static/media/ben-full-logo.3f23ec07.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--voui7ShJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blockchainedu.org/static/media/ben-full-logo.3f23ec07.png" alt="BEN Image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Hyperlinks
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--E2ovZJyu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/links%2520.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--E2ovZJyu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://raw.githubusercontent.com/HariPranav/BENIMAGES/master/links%2520.png" alt="Hyperlinks"&gt;&lt;/a&gt;&lt;br&gt;
Use square brackets for the link text and round brackets for the URL.&lt;/p&gt;

&lt;p&gt;eg: &lt;a href="https://blockchainedu.org/"&gt;BEN_Website&lt;/a&gt;&lt;/p&gt;
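
&lt;p&gt;The three constructs above can be tried directly in any markdown file. A small sample (the heading text and alt text are just examples; the URLs are the ones used earlier):&lt;/p&gt;

```markdown
# Largest heading (h1)
## Second-level heading (h2)

![BEN logo](https://blockchainedu.org/static/media/ben-full-logo.3f23ec07.png)

[BEN_Website](https://blockchainedu.org/)
```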

&lt;h1&gt;
  
  
  MkDocs
&lt;/h1&gt;

&lt;p&gt;MkDocs is a fast, simple and downright gorgeous static site generator that's geared towards building project documentation. Documentation source files are written in Markdown, and configured with a single YAML configuration file. Start by reading the introduction below, then check the User Guide for more info.&lt;/p&gt;

&lt;h1&gt;
  
  
  Running MkDocs in Local Machine
&lt;/h1&gt;

&lt;p&gt;In order to manually install MkDocs you'll need Python installed on your system, as well as the Python package manager, pip. You can check if you have these already installed from the command line:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ python --version

$ pip --version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;If you don't have python already installed download it from here&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.python.org/downloads/"&gt;pythonDownload&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Run&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ pip --version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;If pip is not installed, you can find it in the link below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://packaging.python.org/tutorials/installing-packages/"&gt;pipDownload&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Install mkdocs using pip&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ pip install mkdocs
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Run&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ mkdocs --version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;to check that everything worked okay.&lt;/p&gt;

&lt;p&gt;If this does not work, then Python has not been added to your PATH; a neat way around this is to run&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ py -m mkdocs --version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h1&gt;
  
  
  Creating a new project
&lt;/h1&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ mkdocs new my-project
$ cd my-project
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Open a new terminal and open the "my-project" folder; there you will find a file called index.md.&lt;/p&gt;

&lt;p&gt;Edit the file using a text editor, then run&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ mkdocs serve
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;or&lt;br&gt;
    $ py -m mkdocs serve&lt;/p&gt;

&lt;p&gt;Your static site is now up and running&lt;br&gt;
Open up your browser and go to&lt;/p&gt;

&lt;p&gt;&lt;a href="http://127.0.0.1:8000"&gt;http://127.0.0.1:8000&lt;/a&gt;, to view the webpage&lt;/p&gt;

&lt;p&gt;Now let's add some content to it and host it on GitHub !!&lt;/p&gt;
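
&lt;p&gt;The site's title, navigation, and theme are controlled by the mkdocs.yml file at the root of the project. A minimal sketch (site_name and the page title here are just examples):&lt;/p&gt;

```yaml
site_name: my-project
nav:
  - Home: index.md
theme: mkdocs
```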

&lt;h1&gt;
  
  
  Pushing the site to git
&lt;/h1&gt;

&lt;p&gt;Install git if you haven't from the link below&lt;/p&gt;

&lt;p&gt;&lt;a href="https://git-scm.com/downloads"&gt;git&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Go to github and create a new repository with the same name as the project folder in this case &lt;strong&gt;my-project&lt;/strong&gt;. This step is crucial as we need to deploy the site to the correct repository.&lt;/p&gt;

&lt;p&gt;Sign in and create a new repository&lt;/p&gt;

&lt;p&gt;Add the name&lt;/p&gt;

&lt;p&gt;And a short description&lt;/p&gt;

&lt;p&gt;Leave "Initialize this repository with a README" unticked and click Create repository.&lt;/p&gt;

&lt;p&gt;Now open the command line and run&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ git init .

$ git add .

$ git commit -m "First Commit"

$ git remote add origin PASTE YOUR URL 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
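
&lt;p&gt;The init/add/commit sequence can be rehearsed in a scratch folder before pointing it at a real remote. The folder name, author name, and email below are hypothetical:&lt;/p&gt;

```shell
# Rehearse the git sequence in a throwaway repository
mkdir -p my-project-scratch
git init my-project-scratch
echo "# my-project" > my-project-scratch/README.md
git -C my-project-scratch add .
git -C my-project-scratch -c user.name=you -c user.email=you@example.com commit -m "First Commit"
git -C my-project-scratch log --oneline
```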

&lt;p&gt;Finally run&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ py -m mkdocs gh-deploy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now your site is deployed via GitHub Pages. Hurray!&lt;/p&gt;

&lt;h1&gt;
  
  
  Publishing on Medium
&lt;/h1&gt;

&lt;p&gt;Medium is a good platform to publish blogs and articles on various subjects. Go to&lt;/p&gt;

&lt;p&gt;&lt;a href="https://medium.com/"&gt;Medium_Website&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Sign in with your Gmail or create a new account and click on create a new story&lt;/p&gt;

&lt;p&gt;The disadvantage of Medium is that it does not have direct support for markdown files, so we can use a third-party website. Go to&lt;/p&gt;

&lt;p&gt;&lt;a href="https://markdowntomedium.com/create"&gt;MarkdownToMedium&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;and follow the steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;paste your markdown, convert it into a GitHub gist, and connect to Medium&lt;/li&gt;
&lt;li&gt;paste the link in Medium&lt;/li&gt;
&lt;li&gt;import the link, and your story is now published to Medium!&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Before publishing to Medium, make sure that all the content from the markdown file has come through in the correct order, as sometimes there are bugs when importing gists from GitHub.&lt;/p&gt;

&lt;h1&gt;
  
  
  Decentralised Hosting on IPFS
&lt;/h1&gt;

&lt;h2&gt;
  
  
  What is IPFS ??
&lt;/h2&gt;

&lt;p&gt;A peer-to-peer hypermedia protocol&lt;br&gt;
designed to make the web faster, safer, and more open.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://ipfs.io/"&gt;IPFS&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What is fleek ??
&lt;/h2&gt;

&lt;p&gt;On IPFS, Fleek provides a beautiful way to integrate with Git, so that whenever we make changes to our branches they can be directly&lt;br&gt;
updated on Fleek!&lt;/p&gt;

&lt;h3&gt;
  
  
  Hosting on Fleek
&lt;/h3&gt;

&lt;p&gt;Click on the link below to learn more; there we can directly log in with our Git account and host our website.&lt;br&gt;
Click on sign in, sign in with GitHub, add the repository containing your GitHub page, and you're done!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://fleek.co/"&gt;Fleek&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Different CMS options to host sites on Fleek: link below&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.fleek.co/"&gt;Fleek Blog posts&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can update the webpage from time to time by pushing changes to Git; Fleek will automatically pick up these changes from the gh-pages branch!&lt;/p&gt;

&lt;p&gt;The best part is that we can make multiple pages with a single GitHub ID!!&lt;/p&gt;

&lt;p&gt;Awesome we now have our very own &lt;strong&gt;documentation&lt;/strong&gt; website hosted on a &lt;strong&gt;centralized&lt;/strong&gt; as well as a &lt;strong&gt;Decentralized Service!!&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>programming</category>
      <category>documentation</category>
      <category>python</category>
    </item>
    <item>
      <title>Decentralized Portfolio Website</title>
      <dc:creator>Hari Pranav A</dc:creator>
      <pubDate>Tue, 13 Apr 2021 15:50:50 +0000</pubDate>
      <link>https://dev.to/haripranav/decentralized-portfolio-website-ii0</link>
      <guid>https://dev.to/haripranav/decentralized-portfolio-website-ii0</guid>
      <description>&lt;h2&gt;
  
  
  What is a Portfolio Website ??
&lt;/h2&gt;

&lt;p&gt;A portfolio website is an essential tool to getting more business and building your professional brand. In today’s digital world, a portfolio is arguably more important than a resume, no matter what industry you work in. Whether you are a freelance journalist, a recent college grad looking for a job, or even an accountant, when people Google your name, your portfolio will provide the most powerful and comprehensive perspective on you.&lt;/p&gt;

&lt;h2&gt;
  
  
  Awesome where do I start ?
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Lets choose a tech stack
&lt;/h3&gt;

&lt;p&gt;It can be just pure HTML and CSS, or we can use JS frameworks like React, Angular, Vue, Ember, Meteor, or Mithril; the list goes on and on for JavaScript &lt;strong&gt;FacePalm&lt;/strong&gt;. But here we are going to use Flutter!!!&lt;/p&gt;

&lt;h3&gt;
  
  
  What is Flutter ??
&lt;/h3&gt;

&lt;p&gt;Flutter is Google’s UI toolkit for building beautiful, natively compiled applications for mobile, web, and desktop from a single codebase.&lt;/p&gt;

&lt;p&gt;Say what?? A single framework for web, mobile, and desktop?? Even on a Raspberry Pi :&amp;gt;&lt;/p&gt;

&lt;p&gt;The basics of Flutter can be learned in the link below&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/HariPranav/Flutter" rel="noopener noreferrer"&gt;Flutter Basics&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/HariPranav/flutter_learning_path" rel="noopener noreferrer"&gt;Flutter Advanced&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Without installing Flutter on Local Machine
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fraw.githubusercontent.com%2FHariPranav%2FBENIMAGES%2Fmaster%2FScreen1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fraw.githubusercontent.com%2FHariPranav%2FBENIMAGES%2Fmaster%2FScreen1.png" alt="Basic UI"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Go to the link below and paste in the code that follows.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/HariPranav/Flutter" rel="noopener noreferrer"&gt;DartPad&lt;/a&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import 'package:flutter/material.dart';


void main()
{
    runApp(MyApp());
}
class MyApp extends StatefulWidget {
@override
_MyAppState createState() =&amp;gt; _MyAppState();
}

class _MyAppState extends State&amp;lt;MyApp&amp;gt; {
 @override
Widget build(BuildContext context) {
return MaterialApp(
  debugShowCheckedModeBanner: false,
  home: Scaffold(
    body: Container(
      decoration: BoxDecoration(
        image: DecorationImage(
          image: NetworkImage("https://images-wixmp-ed30a86b8c4ca887773594c2.wixmp.com/i/0052d15d-4c44-4d0f-aade-45074bff0633/d9erbf9-27735655-772d-437a-ae35-ef67e1ea9d10.jpg"),
          fit: BoxFit.fill
          ,
        ),
      ),
      child: ListView(children: [
        Center(
          child: Container(
            margin: EdgeInsets.all(40),
            height: 150,
            width: 150,
            decoration: BoxDecoration(
              shape: BoxShape.circle,
              image: DecorationImage(
                  image: NetworkImage('https://scontent.fblr2-1.fna.fbcdn.net/v/t1.0-1/c0.120.320.320a/p320x320/45144634_1855611297884897_6210030828886425600_n.jpg?_nc_cat=105&amp;amp;_nc_sid=7206a8&amp;amp;_nc_oc=AQmllP_29URGPDyNhajv3rw2YzkqMeeukAQiwgSKm4XJRBhNI533aqwYd13gJN8DdS0&amp;amp;_nc_ht=scontent.fblr2-1.fna&amp;amp;oh=2919360f49c924ee76c6c2f60fc56325&amp;amp;oe=5F1416AE'),
                  fit: BoxFit.contain),
            ),
          ),
        ),
        Center(
          child: Container(
            decoration: BoxDecoration(
              borderRadius: BorderRadius.all(Radius.circular(20)),
              gradient: LinearGradient(
                begin: Alignment.bottomLeft,
                end: Alignment.bottomRight,
                colors: [Colors.green[50], Colors.red[300]],
              ),
            ),
            margin: EdgeInsets.all(10),
            padding: EdgeInsets.all(5),
            child: Text(
              " Flutter | Blockchain | Forensics",
              style: TextStyle(
                color: Colors.black,
                fontSize: 20,
              ),
            ),
          ),
        ),
        Center(
          child: Container(
            decoration: BoxDecoration(
              borderRadius: BorderRadius.all(Radius.circular(20)),
              gradient: LinearGradient(
                begin: Alignment.bottomRight,
                end: Alignment.bottomLeft,
                colors: [Colors.green[50], Colors.red[300]],
              ),
            ),
            margin: EdgeInsets.all(10),
            padding: EdgeInsets.all(5),
            child: Row(mainAxisSize: MainAxisSize.min, children: &amp;lt;Widget&amp;gt;[
              InkWell(
                  child: Image.network(
                    "https://cdn.icon-icons.com/icons2/936/PNG/128/github-logo_icon-icons.com_73546.png",
                    height: 40,
                    width: 40,
                  ),
//                       onTap: () =&amp;gt; launch('https://github.com/HariPranav')
              ),
              SizedBox(
                width: 8,
              ),
              InkWell(
                child: Image.network(
                  "https://cdn.icon-icons.com/icons2/31/PNG/128/sociallinkedin_member_2751.png",
                  height: 40,
                  width: 40,
                ),
//                     onTap: () =&amp;gt; launch(
//                         'https://in.linkedin.com/in/hari-pranav-77b067162'),
              ),
              SizedBox(
                width: 8,
              ),
              InkWell(
                child: Image.network(
                  "https://cdn.icon-icons.com/icons2/555/PNG/128/facebook_icon-icons.com_53612.png",
                  height: 40,
                  width: 40,
                ),
//                     onTap: () =&amp;gt;
//                         launch('https://www.facebook.com/hari.pranav.1'),
              ),
              SizedBox(
                width: 8,
              ),
              InkWell(
                child: Image.network(
                  "https://cdn.icon-icons.com/icons2/122/PNG/128/twitter_socialnetwork_20007.png",
                  height: 40,
                  width: 40,
                ),
//                     onTap: () =&amp;gt; launch('https://twitter.com/aharipranav'),
              ),
              SizedBox(
                width: 8,
              ),
              InkWell(
                child: Image.network(
                  "https://cdn.icon-icons.com/icons2/195/PNG/128/YouTube_23392.png",
                  height: 40,
                  width: 40,
                ),
//                     onTap: () =&amp;gt; launch(
//                         'https://www.youtube.com/channel/UCWYIT8-ScPTy-fWSa3ff6_A?view_as=subscriber'),
              ),
            ]),
          ),
        ),
      ]),
    ),
    ),
    );
 }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Here we are using the online tool DartPad to render the UI in the browser. You might have noticed that the onTap handlers in the code are commented out: they need the url_launcher package, which DartPad does not currently support, so clicking a social media icon doesn't link anywhere. To add the hyperlinking functionality we need to install Flutter and Dart in our local environment!!&lt;/p&gt;

&lt;h1&gt;
  
  
  With Flutter on Local Machine
&lt;/h1&gt;

&lt;p&gt;Installing Flutter: &lt;a href="https://flutter.dev/docs/get-started/install" rel="noopener noreferrer"&gt;Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1 :
&lt;/h3&gt;

&lt;p&gt;After installing Flutter we need to build the application for the web, so switch the Flutter channel and enable web support:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$flutter channel master

$flutter upgrade

$flutter config --enable-web
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h3&gt;
  
  
  Step 2 :
&lt;/h3&gt;

&lt;p&gt;Run:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$flutter devices
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Here we should see Chrome among the available devices. PS: if it doesn't show up, check that you have the latest version of Chrome.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3  :
&lt;/h3&gt;

&lt;p&gt;Run:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$flutter create project_name
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;replacing project_name with the name of your app.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4  :
&lt;/h3&gt;

&lt;p&gt;Let's get coding!!!&lt;/p&gt;

&lt;p&gt;Open pubspec.yaml and go to the commented section under dependencies:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    dependencies:

    flutter:
        sdk: flutter

        // paste this code below

    url_launcher: ^5.4.10
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Go to the terminal and type&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$pub get
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This installs the dependencies.&lt;/p&gt;

&lt;p&gt;Next, open main.dart under the lib directory in your project.&lt;/p&gt;

&lt;p&gt;Then paste the code below:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// importing the libraries

import 'package:flutter/material.dart';
import 'package:url_launcher/url_launcher.dart';

// creating a stateful widget
// type stf and it auto creates one (Works in vs code and android studion with )

void main()
{
    runApp(MyApp());
}
class MyApp extends StatefulWidget {
@override
_MyAppState createState() =&amp;gt; _MyAppState();
}

class _MyAppState extends State&amp;lt;MyApp&amp;gt; {
@override
Widget build(BuildContext context) {
    return MaterialApp(
        // hide the debug banner
    debugShowCheckedModeBanner: false,
    // the Scaffold provides predefined Material components for the UI
    home: Scaffold(
        // we use a container to display the Image
        body: Container(
        decoration: BoxDecoration(
            image: DecorationImage(
                // create a folder called assets in your project root (declared in pubspec.yaml) and put the background image to be displayed there
            image: AssetImage("assets/ss2.jpg"),
            fit: BoxFit.cover,
            ),
        ),
        child: ListView(children: [
            Center(
            child: Container(
                margin: EdgeInsets.all(40),
                height: 150,
                width: 150,
                decoration: BoxDecoration(
                shape: BoxShape.circle,
                image: DecorationImage(
                    image: AssetImage('assets/HP.png'), fit: BoxFit.contain),
                ),
            ),
            ),
            Center(
            child: Container(
                // we use the famous Linear Gradient
                decoration: BoxDecoration(
                borderRadius: BorderRadius.all(Radius.circular(20)),
                gradient: LinearGradient(
                    begin: Alignment.bottomLeft,
                    end: Alignment.bottomRight,
                    colors: [Colors.green[50], Colors.red[300]],
                ),
                ),
                margin: EdgeInsets.all(10),
                padding: EdgeInsets.all(5),
                child: Text(
                " Flutter | Blockchain | Forensics",
                style: TextStyle(
                    color: Colors.black,
                    fontSize: 20,
                ),
                ),
            ),
            ),
            Center(
            child: Container(
                decoration: BoxDecoration(
                borderRadius: BorderRadius.all(Radius.circular(20)),
                gradient: LinearGradient(
                    begin: Alignment.bottomRight,
                    end: Alignment.bottomLeft,
                    colors: [Colors.green[50], Colors.red[300]],
                ),
                ),
                margin: EdgeInsets.all(10),
                padding: EdgeInsets.all(5),
                child: Row(mainAxisSize: MainAxisSize.min, children: &amp;lt;Widget&amp;gt;[
                    // here we add the links to our various social media websites
                InkWell(
                    child: Image.asset(
                        "assets/github.png",
                        height: 40,
                        width: 40,
                    ),
                    onTap: () =&amp;gt; launch('https://github.com/HariPranav')),
                SizedBox(
                    width: 8,
                ),
                InkWell(
                    child: Image.asset(
                    "assets/linkedin-square-color.png",
                    height: 40,
                    width: 40,
                    ),
                    onTap: () =&amp;gt; launch(
                        'https://in.linkedin.com/in/hari-pranav-77b067162'),
                ),
                SizedBox(
                    width: 8,
                ),
                InkWell(
                    child: Image.asset(
                    "assets/facebook-round-color.png",
                    height: 40,
                    width: 40,
                    ),
                    onTap: () =&amp;gt; launch(
                        'https://www.facebook.com/hari.pranav.1'),
                ),
                SizedBox(
                    width: 8,
                ),
                InkWell(
                    child: Image.asset(
                    "assets/twitter-color.png",
                    height: 40,
                    width: 40,
                    ),
                    onTap: () =&amp;gt; launch('https://twitter.com/aharipranav'),
                ),
                SizedBox(
                    width: 8,
                ),
                InkWell(
                    child: Image.asset(
                    "assets/youtube-color.png",
                    height: 40,
                    width: 40,
                    ),
                    onTap: () =&amp;gt; launch(
                        'https://www.youtube.com/channel/UCWYIT8-ScPTy-fWSa3ff6_A?view_as=subscriber'),
                ),
                ]),
            ),
            ),
        ]),
        ),
    ),
    );
        }
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
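&lt;p&gt;As an optional refinement (a sketch, not part of the original code): url_launcher 5.x also exposes canLaunch, so the repeated launch(...) calls can go through a small helper that checks the URL first instead of failing silently. The helper name _open here is mine:&lt;/p&gt;

```dart
// Hypothetical helper for main.dart, assuming url_launcher ^5.x (the launch/canLaunch API).
_open(String url) async {
  if (await canLaunch(url)) {
    // the URL scheme is supported on this platform, so open it
    await launch(url);
  } else {
    // fall back to a debug message instead of throwing
    debugPrint('Could not launch $url');
  }
}
```

&lt;p&gt;Each icon handler then becomes onTap: () =&amp;gt; _open('https://github.com/HariPranav'), so every link goes through the same check.&lt;/p&gt;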

&lt;p&gt;Go to the terminal and type:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$flutter run -d chrome
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Voila, your single-page website is ready on the web!!! It can also run as a standalone app on your Android phone or iOS device!!&lt;/p&gt;

&lt;p&gt;To do so, switch the channel back to stable with the command:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$flutter channel stable
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
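&lt;p&gt;On the stable channel you can then try the same project on a phone. These are the standard Flutter CLI commands (your device IDs will vary):&lt;/p&gt;

```shell
# list connected devices and emulators
flutter devices

# run the app on a connected Android or iOS device
flutter run

# or build a release APK for Android
flutter build apk --release
```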

&lt;p&gt;Anyway, let's get to hosting our website on the web.&lt;/p&gt;

&lt;h1&gt;
  
  
  Adding Project to Github
&lt;/h1&gt;

&lt;p&gt;Go to GitHub and create a new repository.&lt;/p&gt;

&lt;p&gt;Add the name&lt;/p&gt;

&lt;p&gt;and a short description.&lt;/p&gt;

&lt;p&gt;Leave "Initialize this repository with a README" unticked and click Create repository.&lt;/p&gt;

&lt;p&gt;Now open the command line and run:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$git init

$git add .

$git commit -m "First commit"

$git remote add "URL of repo"

$git push -u origin master
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now go to the repository settings and enable the GitHub Pages option.&lt;/p&gt;

&lt;p&gt;Then run the following commands in your Flutter project:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$flutter pub global activate peanut

$flutter pub global run  peanut

$git push origin --set-upstream gh-pages
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Finally, you are done!!!!&lt;/p&gt;

&lt;p&gt;The page is now hosted on GitHub as the centralised version!!&lt;/p&gt;

&lt;p&gt;Feel free to share the URL with your friends, family, and employers.&lt;/p&gt;

&lt;p&gt;Hoorah! Let's decentralize it!&lt;/p&gt;

&lt;h1&gt;
  
  
  Decentralization
&lt;/h1&gt;

&lt;h2&gt;
  
  
  What is IPFS?
&lt;/h2&gt;

&lt;p&gt;A peer-to-peer hypermedia protocol&lt;br&gt;
designed to make the web faster, safer, and more open.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://ipfs.io/" rel="noopener noreferrer"&gt;IPFS&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Fleek?
&lt;/h2&gt;

&lt;p&gt;Fleek provides a beautiful way to integrate Git with IPFS, so that whenever we make changes to our branches the site is directly updated on Fleek!!!!&lt;/p&gt;

&lt;h3&gt;
  
  
  Hosting on Fleek
&lt;/h3&gt;

&lt;p&gt;Click on the link below to know more; there we can log in directly with our GitHub account and host our website.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/login?client_id=1cfc1c05c0a7ac516723&amp;amp;return_to=%2Flogin%2Foauth%2Fauthorize%3Fclient_id%3D1cfc1c05c0a7ac516723%26redirect_uri%3Dhttps%253A%252F%252Fauth.fleek.co%252Fgithub%252Fcallback%253Fstate%253D%252Finteraction%252FZHuHDYZi80oBowaz9y03R%252Fgithub%252Fcallback%26response_type%3Dcode%26scope%3Duser%253Aemail" rel="noopener noreferrer"&gt;Fleek&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The different CMS options for hosting sites on Fleek are covered in the link below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.fleek.co/" rel="noopener noreferrer"&gt;Fleek Blog posts&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Find Me !!
&lt;/h1&gt;

&lt;p&gt;I have added more pages to my portfolio, such as an animated bottom navigation bar with custom images, a blog section, and a short demo of my hackathon projects, which will be added soon.&lt;/p&gt;

&lt;p&gt;Find me on the web below!!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://haripranav.github.io/portfolio/#/" rel="noopener noreferrer"&gt;My Portfolio&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://hp.on.fleek.co/#/" rel="noopener noreferrer"&gt;My Decentralized Portfolio&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ipfs</category>
      <category>portfolio</category>
      <category>blockchain</category>
      <category>github</category>
    </item>
  </channel>
</rss>
