<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Sunish Surendran K </title>
    <description>The latest articles on DEV Community by Sunish Surendran K  (@sunishsurendrank).</description>
    <link>https://dev.to/sunishsurendrank</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F493079%2F98b4c9c5-5443-432f-9568-78e9dd63ede3.jpeg</url>
      <title>DEV Community: Sunish Surendran K </title>
      <link>https://dev.to/sunishsurendrank</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/sunishsurendrank"/>
    <language>en</language>
    <item>
      <title>Control your playbook Task execution</title>
      <dc:creator>Sunish Surendran K </dc:creator>
      <pubDate>Tue, 12 Sep 2023 08:43:57 +0000</pubDate>
      <link>https://dev.to/sunishsurendrank/control-your-playbook-task-execution-4no4</link>
      <guid>https://dev.to/sunishsurendrank/control-your-playbook-task-execution-4no4</guid>
      <description>&lt;p&gt;In Ansible, fork, serial, throttle, and async are configuration parameters that control the concurrency and parallelism of task execution&lt;/p&gt;

&lt;h1&gt;
  
  
  Fork
&lt;/h1&gt;

&lt;p&gt;Forks specifies the maximum number of parallel connections Ansible will use when executing tasks. Increasing the forks value lets you parallelize task execution across more hosts.&lt;br&gt;
For example, setting forks = 3 means Ansible will execute tasks on up to 3 hosts simultaneously, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Y9WtvPl5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1z8qjgd0dtl7wi6s6v75.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Y9WtvPl5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1z8qjgd0dtl7wi6s6v75.gif" alt="Ansible Fork" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Serial
&lt;/h1&gt;

&lt;p&gt;The serial parameter in an Ansible playbook allows you to control the number of hosts that are targeted at a time during playbook execution. It's useful for scenarios where you want to limit the rate of change or avoid overloading hosts.&lt;/p&gt;

&lt;p&gt;For instance, if you set serial: 3, Ansible will target and execute tasks on three hosts at a time, waiting for them to complete before moving on to the next set of hosts.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zLnUUX1d--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y8q2z271j0ldt3nlp5fn.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zLnUUX1d--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y8q2z271j0ldt3nlp5fn.gif" alt="Ansible Serial" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Throttle
&lt;/h1&gt;

&lt;p&gt;The throttle parameter in an Ansible playbook allows you to limit the number of hosts a specific task runs on at a time. In the example below, Task 2 runs on only one machine at a time.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8_wfdLVs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wdl46i64gm4cty6xmz9e.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8_wfdLVs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wdl46i64gm4cty6xmz9e.gif" alt="Ansible Throttle" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Async
&lt;/h1&gt;

&lt;p&gt;The async parameter in an Ansible task allows that task to run asynchronously: Ansible starts the task on one set of machines and immediately moves on to the next set of machines without waiting for the first set to complete.&lt;/p&gt;

&lt;p&gt;You can also use the poll parameter in conjunction with async to specify how often Ansible should check the status of the asynchronous task. In the example below, poll is set to 2, so Ansible checks the status of the task on each machine every two seconds.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dUHr3FP1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j93l09e2oroblyj51iw9.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dUHr3FP1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j93l09e2oroblyj51iw9.gif" alt="Ansible async" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I will end this blog with a question: if you want a machine to jump to Task 2 as soon as its own Task 1 is completed, while other machines are still executing Task 1, how would you structure your playbook to achieve this? Please comment if you know the answer.&lt;/p&gt;

</description>
      <category>ansible</category>
      <category>redhat</category>
      <category>automation</category>
      <category>infrastructureascode</category>
    </item>
    <item>
      <title>Creating Python Packages</title>
      <dc:creator>Sunish Surendran K </dc:creator>
      <pubDate>Sun, 25 Jun 2023 14:16:08 +0000</pubDate>
      <link>https://dev.to/sunishsurendrank/create-python-packages-54nj</link>
      <guid>https://dev.to/sunishsurendrank/create-python-packages-54nj</guid>
      <description>&lt;h1&gt;
  
  
  TLDR
&lt;/h1&gt;

&lt;p&gt;Prior to delving straight into the creation of Python packages, let's first grasp the concept of &lt;strong&gt;Modular programming&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EiuT0f4J--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0e85yn6pmyb4r5animl7.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EiuT0f4J--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0e85yn6pmyb4r5animl7.JPG" alt="caveman with modular programing" width="449" height="468"&gt;&lt;/a&gt;&lt;br&gt;
Modular programming refers to the process of breaking a large, unwieldy programming task into separate, smaller, more manageable subtasks or modules. Individual modules can then be cobbled together like building blocks to create a larger application.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;So, what are the examples of modular programming in Python?&lt;/strong&gt;&lt;br&gt;
Functions, modules, and packages are all constructs in Python that promote code modularization.&lt;/p&gt;

&lt;p&gt;NICE!!&lt;/p&gt;

&lt;p&gt;You may already know what a function is in Python, but have you ever been confused about the difference between a module and a package?&lt;/p&gt;

&lt;p&gt;Suppose you have developed a very large application that includes many modules. As the number of modules grows, it becomes difficult to keep track of them all if they are dumped into one location. This is particularly so if they have similar names or functionality. You might wish for a means of grouping and organizing them.&lt;/p&gt;

&lt;p&gt;In this case, packages allow for hierarchical structuring of the module namespace using dot notation.&lt;/p&gt;
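&lt;p&gt;To make the dot notation concrete, here is a small, self-contained sketch (the demo_pkg and utils names are made up for illustration) that builds a package hierarchy on disk and imports a module from it:&lt;/p&gt;

```python
import os
import sys
import tempfile

# Build a tiny package hierarchy on disk (demo_pkg and utils are made-up names)
root = tempfile.mkdtemp()
pkg_dir = os.path.join(root, "demo_pkg", "utils")
os.makedirs(pkg_dir)

# __init__.py files mark the folders as packages; they can be empty
open(os.path.join(root, "demo_pkg", "__init__.py"), "w").close()
open(os.path.join(pkg_dir, "__init__.py"), "w").close()

# A module inside the nested package
with open(os.path.join(pkg_dir, "text.py"), "w") as f:
    f.write("def shout(s):\n    return s.upper() + '!'\n")

# Dot notation addresses the module through the package hierarchy
sys.path.insert(0, root)
from demo_pkg.utils import text

print(text.shout("hello"))  # HELLO!
```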

&lt;p&gt;OK, that's great! Let's say we have some modules available; now let's see how we can create a package out of them!&lt;/p&gt;

&lt;p&gt;Python packages are created using packaging tools such as setuptools, Flit, PDM, and Poetry.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2B2jREPQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/osbpb1yar1vgrevkjbvh.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2B2jREPQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/osbpb1yar1vgrevkjbvh.jpg" alt="Python Package Tool" width="757" height="246"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Python setuptools is a widely used tool for packaging Python projects. Today our plan is to create a package using setuptools.&lt;/p&gt;
&lt;h3&gt;
  
  
  Step 1 : Setup your environment
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Upgrade pip tool
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python -m pip install --upgrade pip
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Upgrade Build
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python -m pip install --upgrade build
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;This installs the following packages on your machine: build-0.10.0, packaging-23.1, pyproject_hooks-1.0.0, and tomli-2.0.1.&lt;/p&gt;

&lt;p&gt;The build package is used to build the Python package.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Install setuptools
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install setuptools
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;You can read about setuptools at &lt;a href="https://pypi.org/project/setuptools/"&gt;https://pypi.org/project/setuptools/&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Install twine tool
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install twine
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h3&gt;
  
  
  Step 2 : Create folder structure as shown below
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fqWa5qga--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/f12vff0yxyqjhbox8ypu.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fqWa5qga--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/f12vff0yxyqjhbox8ypu.JPG" alt="Image description" width="202" height="135"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here I created a root folder called sunish_package, and under it the folders src/sunish_package and test.&lt;/p&gt;

&lt;p&gt;You can also see that I created files called LICENSE, README.md, and pyproject.toml.&lt;/p&gt;
&lt;h3&gt;
  
  
  Step 3: Add modules
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--c-hDhhJ4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i1wfyrec8cbswpme2noo.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--c-hDhhJ4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i1wfyrec8cbswpme2noo.JPG" alt="Image description" width="269" height="179"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Under the folder src/sunish_package I created two files named __init__.py and module_1.py&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;__init__.py is the package initialization script; it can be empty&lt;/li&gt;
&lt;li&gt;module_1.py is our module with test code
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def test_function():
    print("Hello World")
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h2&gt;
  
  
  Step 4: Update pyproject.toml file
&lt;/h2&gt;

&lt;p&gt;In older versions of setuptools, the user needed to create a setup.py script in the root of the package containing all of the package's configuration. Starting from setuptools version 61.0.0, support was introduced for reading package metadata and build configuration from the pyproject.toml file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[build-system]
requires = ["setuptools&amp;gt;=61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "sunish_package"
version = "0.0.1"
authors = [
  { name="Example Author", email="author@example.com" },
]
description = "A small example package"
readme = "README.md"
requires-python = "&amp;gt;=3.7"
classifiers = [
    "Programming Language :: Python :: 3",
    "License :: OSI Approved :: MIT License",
    "Operating System :: OS Independent",
]

[project.urls]
"Homepage" = "https://github.com/pypa/sampleproject"
"Bug Tracker" = "https://github.com/pypa/sampleproject/issues"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The important thing to note here is the name of the package: it should match the folder name under src. You can also see that under the [build-system] section I filled in the requires field with setuptools. This indicates that the package must be built using setuptools.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 5: Update LICENSE and README.md
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;README.md&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# About Package

This is an example package
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;LICENSE&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Copyright (c) 2023 Sunish.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I recommend that you use a proper license and update the README with your package's information. Since I am creating a test package, I am filling them with basic information here.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 6 : Create Python Package
&lt;/h2&gt;

&lt;p&gt;Go to your root folder and run the below command.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python -m build .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--17zQnnRV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mfcvouyrsnlmc4ugqezy.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--17zQnnRV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mfcvouyrsnlmc4ugqezy.JPG" alt="Image description" width="637" height="154"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This created two folders called dist and sunish_package.egg-info&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MjO2YfGM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wyg9zjqrdv0m1dlgajhy.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MjO2YfGM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wyg9zjqrdv0m1dlgajhy.JPG" alt="Image description" width="800" height="252"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The .whl file is the one we want; Python packages are distributed in a format called wheels (.whl).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dLxIVVgg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jllbf7da3btgb4kcsnoh.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dLxIVVgg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jllbf7da3btgb4kcsnoh.JPG" alt="Image description" width="710" height="521"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 7 : Import and test
&lt;/h2&gt;

&lt;p&gt;Go to the dist folder and install the package using pip&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install .\sunish_package-0.0.1-py3-none-any.whl
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can check that the package appears in the pip list with the command below&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip list -v
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now open a Python console and import our new package&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from sunish_package import module_1
module_1.test_function()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MjtPHgqQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0anspvnr4mxz8877g126.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MjtPHgqQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0anspvnr4mxz8877g126.JPG" alt="Image description" width="792" height="113"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;SUCCESS!! It prints "Hello World" as expected.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 8 : Push the package to an internal Artifactory Python repository
&lt;/h2&gt;

&lt;p&gt;In order to push the package to our internal Artifactory, we make use of the Twine tool.&lt;/p&gt;

&lt;p&gt;Create the .pypirc file in your home directory and update the Artifactory URL, username, and password&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[distutils]

index-servers = local

[local]

repository: http://localhost:8081/artifactory/api/pypi/pypi

username: &amp;lt;user_name&amp;gt;

password: &amp;lt;password&amp;gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then run the command below to push the package to the Artifactory repo&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;twine upload -r local &amp;lt;PATH_TO_THE_FILES&amp;gt; --config-file ~/.pypirc
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>python</category>
      <category>programming</category>
      <category>package</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Behind the Automation of Jurassic Park</title>
      <dc:creator>Sunish Surendran K </dc:creator>
      <pubDate>Fri, 19 May 2023 21:19:17 +0000</pubDate>
      <link>https://dev.to/sunishsurendrank/behind-the-automation-of-jurassic-park-42d8</link>
      <guid>https://dev.to/sunishsurendrank/behind-the-automation-of-jurassic-park-42d8</guid>
      <description>&lt;p&gt;The iconic Jurassic Park movie, released in 1993, captivated audiences with its mesmerizing depiction of a fully automated park powered by computer programming. While we marvelled at the breathtaking scenes, have you ever wondered which programming language was utilised to bring the park's automation to life?"&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wVY6QcUJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ipkjgzgosa9u2w0s9se9.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wVY6QcUJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ipkjgzgosa9u2w0s9se9.jpeg" alt="Image description" width="452" height="678"&gt;&lt;/a&gt;&lt;br&gt;
While it is not explicitly stated in the Jurassic Park movie that the control system language used in the park is Pascal, there are indeed references and similarities to Pascal in the film.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4jzlI5ew--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ephjbucxj3h4u9bkzigy.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4jzlI5ew--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ephjbucxj3h4u9bkzigy.jpeg" alt="Image description" width="800" height="600"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the movie, the character Dennis Nedry, portrayed as a skilled computer programmer, is shown writing code for the park's control system. The code snippets shown on screen display Pascal-like syntax and structure, suggesting a resemblance to the Pascal programming language.&lt;/p&gt;

&lt;p&gt;Additionally, the character Ray Arnold, played by Samuel L. Jackson, mentions "Pascal routines" when discussing the park's control system. This further suggests that Pascal may have been an inspiration for the fictional language depicted in the movie.&lt;/p&gt;

&lt;p&gt;However, it's important to note that the language portrayed in the film as "Nedry's Code" or "JP-CL" is a fictional representation and does not correspond directly to the real Pascal programming language. The movie takes artistic liberties and simplifies the technical aspects for storytelling purposes.&lt;/p&gt;

</description>
      <category>movie</category>
      <category>code</category>
      <category>pascal</category>
    </item>
    <item>
      <title>A dive into Ansible Modules and Module Utilities</title>
      <dc:creator>Sunish Surendran K </dc:creator>
      <pubDate>Tue, 09 May 2023 15:45:17 +0000</pubDate>
      <link>https://dev.to/sunishsurendrank/a-dive-into-ansible-modules-and-module-utilities-5ekh</link>
      <guid>https://dev.to/sunishsurendrank/a-dive-into-ansible-modules-and-module-utilities-5ekh</guid>
      <description>&lt;p&gt;I have been tasked with developing a custom script to be executed on a target machine via Ansible. While aware of Ansible's capability to integrate custom scripts, I am uncertain whether such integrations are accomplished via &lt;strong&gt;Ansible Modules or Plugins&lt;/strong&gt; ?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7wqc4e52vhygg9e68wbp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7wqc4e52vhygg9e68wbp.png" alt="Anibsle module vs Ansible plugin"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After some quick research on Google, I understood that a &lt;strong&gt;Module&lt;/strong&gt; is designed to handle a specific task and provides a standardized interface for Ansible to communicate with remote systems.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Plugins&lt;/strong&gt;, on the other hand, are used to extend the functionality of Ansible itself, rather than defining the desired state of a system. Plugins can be used to add new functionality to Ansible, such as custom inventory scripts, custom connection types, and custom callback plugins. Plugins are generally written in Python and can be loaded by Ansible at runtime.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foew4ljpmjkikee9av2t5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foew4ljpmjkikee9av2t5.png" alt="ansible meme"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So now I know I need to create a module, not a plugin. The next question is which language I should write the module in: &lt;strong&gt;Python or PowerShell&lt;/strong&gt;? It depends on the machine you are targeting. If you plan to run the module against a Linux machine, then Python is the best choice. In my case I am targeting a Windows machine, so I need to write a Windows module.&lt;/p&gt;

&lt;p&gt;Suddenly a question popped into my mind: why can't I write a Python module and use it on both Windows and Linux? That would be the best way, right?&lt;/p&gt;

&lt;p&gt;It is not possible. The Ansible team explains the reason at the link below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.ansible.com/ansible/latest/os_guide/windows_faq.html#can-i-run-python-modules-on-windows-hosts" rel="noopener noreferrer"&gt;https://docs.ansible.com/ansible/latest/os_guide/windows_faq.html#can-i-run-python-modules-on-windows-hosts&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fno6aiil2ve2f5yd219ws.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fno6aiil2ve2f5yd219ws.png" alt="Ansible overview"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;OK, then let's create an Ansible PowerShell module.&lt;/p&gt;

&lt;p&gt;When I read through the documentation, I understood that we should use Ansible's built-in &lt;strong&gt;module_utils&lt;/strong&gt; while creating custom Ansible modules.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr0nxu1dl7jfus6t48n2c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr0nxu1dl7jfus6t48n2c.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why use Ansible module_utils?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Module_utils can be used by Ansible module developers to simplify their code and reduce duplication. By using module_utils functions, module developers can avoid writing boilerplate code and instead focus on implementing the core functionality of their module.&lt;/p&gt;

&lt;p&gt;For example, the ansible.module_utils.basic module provides functions for handling input and output, while the ansible.module_utils.urls module provides functions for making HTTP requests.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Where can I find these Ansible module_utils?&lt;/strong&gt;&lt;br&gt;
You can find them under the path where Ansible is installed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm5scv83x3121qi1uj8ip.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm5scv83x3121qi1uj8ip.png" alt="Ansible Module Utils"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How do I import module_utils into my custom module?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;As shown below, you can import module_utils in both Python and PowerShell modules.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

#!/usr/bin/python
from ansible.module_utils.basic import AnsibleModule


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

#!powershell
#AnsibleRequires -CSharpUtil Ansible.Basic


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgl309ryyqs4eet579j1d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgl309ryyqs4eet579j1d.png" alt="Ansible python and powershell module"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

#python
module = AnsibleModule(argument_spec=spec)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

#powershell
$module = [Ansible.Basic.AnsibleModule]::Create($args, $spec)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Note: the spec parameter is a hashtable/dictionary that defines the argument specification for the module.&lt;/p&gt;

&lt;p&gt;In both the Python and PowerShell Ansible modules, you can see that I created an object called module from the AnsibleModule class.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;AnsibleModule class&lt;/strong&gt; is a core component of Ansible that provides a standard interface for writing Ansible modules.&lt;/p&gt;

&lt;p&gt;The AnsibleModule class can be used to access the input parameters and the output results of the module. The module object it creates provides several methods and properties, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Params: A hashtable that contains the input parameters passed to the module.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;ExitJson($result): A method that exits the module and returns the output result in JSON format.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;FailJson($msg): A method that exits the module with an error message in JSON format.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Warn($msg): A method that writes a warning message to the console.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Fail($msg): A method that writes an error message to the console and exits the module.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
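&lt;p&gt;Putting these pieces together, here is a minimal sketch of what a custom PowerShell module could look like (the option name and the message are placeholders for illustration, not my actual module):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!powershell
#AnsibleRequires -CSharpUtil Ansible.Basic

$spec = @{
    options = @{
        name = @{ type = "str"; required = $true }
    }
    supports_check_mode = $true
}
$module = [Ansible.Basic.AnsibleModule]::Create($args, $spec)

# Read input parameters and build the result
$name = $module.Params.name
$module.Result.msg = "Hello, $name"
$module.Result.changed = $false

# Exit and return the result as JSON
$module.ExitJson()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;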

&lt;p&gt;Using all these methods, I created a nice PowerShell module for Ansible.&lt;br&gt;
I won't explain how I wrote the module here, because it is already covered in the Ansible documentation: &lt;a href="https://docs.ansible.com/ansible/latest/dev_guide/developing_modules_general_windows.html" rel="noopener noreferrer"&gt;https://docs.ansible.com/ansible/latest/dev_guide/developing_modules_general_windows.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So now go and start writing your own Ansible modules. :) Enjoy!!&lt;/p&gt;

</description>
      <category>ansible</category>
      <category>module</category>
      <category>programming</category>
      <category>python</category>
    </item>
    <item>
      <title>Memory management in Python</title>
      <dc:creator>Sunish Surendran K </dc:creator>
      <pubDate>Thu, 09 Mar 2023 20:16:05 +0000</pubDate>
      <link>https://dev.to/sunishsurendrank/memory-management-in-python-4n3a</link>
      <guid>https://dev.to/sunishsurendrank/memory-management-in-python-4n3a</guid>
      <description>&lt;p&gt;Python's memory management is efficient and transparent, and developers do not need to worry about managing memory manually.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--eiLBwavG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i36y88dh0vh0cy877264.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--eiLBwavG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i36y88dh0vh0cy877264.png" alt="Image description" width="616" height="716"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In Python, the call stack holds &lt;strong&gt;function calls, their local variable references, and the exception-handling state&lt;/strong&gt;. Simple immutable values such as &lt;strong&gt;integers, floating-point numbers, and boolean values&lt;/strong&gt; are often described as stack data because they behave like primitives.&lt;/p&gt;

&lt;p&gt;Keeping call frames on the stack allows for efficient memory management. It is worth noting, however, that in CPython every object, including these primitive values, is actually allocated on the heap; the stack frames hold only references to them. Python's memory management is also dynamic and flexible, with optimizations such as small-integer caching and string interning, so the actual cost of an object can depend on its size, lifetime, and usage patterns.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Objects, lists, dictionaries and other data structures&lt;/strong&gt; are stored on the heap. You might now be wondering: if the primitive data types are also objects that end up on the heap, what kinds of objects live there?&lt;/p&gt;

&lt;p&gt;Objects created from a class are typically stored on the heap, not the stack. This is because class instances are typically long-lived objects that can exist beyond the scope of a single function call.&lt;/p&gt;

&lt;p&gt;When you create an instance of a class using the class constructor (i.e. by calling the class with arguments), a new object is created on the heap. The class instance contains all the data members defined in the class and any additional data members that are added dynamically at runtime. The memory required for the object is dynamically allocated and managed by the Python runtime.&lt;/p&gt;

&lt;p&gt;For example, consider the following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;class Person:
    def __init__(self, name, age):
        self.name = name
        self.age = age

person1 = Person("John", 30)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this example, person1 is an object of the Person class that is created on the heap when the Person constructor is called. &lt;/p&gt;

&lt;h2&gt;
  
  
  Let's see how we can create a Stack Overflow and Heap Overflow in Python
&lt;/h2&gt;

&lt;p&gt;Here is an example of a recursive function that could cause a stack overflow:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def recursive_function(x):
    if x == 0:
        return 0
    else:
        return x + recursive_function(x-1)
result = recursive_function(10000)
print(result)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this example, the recursive_function calls itself recursively with smaller values of x until it reaches the base case where x == 0. However, if x is too large, the function will create a large number of nested function calls on the call stack, eventually causing a stack overflow.&lt;/p&gt;
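&lt;p&gt;In CPython this overflow surfaces as a &lt;strong&gt;RecursionError&lt;/strong&gt; once the interpreter's recursion limit is exceeded. You can inspect the limit and catch the error:&lt;/p&gt;

```python
import sys

def recursive_function(x):
    if x == 0:
        return 0
    return x + recursive_function(x - 1)

print(sys.getrecursionlimit())  # the default limit is commonly 1000

try:
    recursive_function(10_000)  # far deeper than the default limit allows
except RecursionError:
    print("stack overflow: maximum recursion depth exceeded")
```

&lt;p&gt;sys.setrecursionlimit() can raise the limit, but a deep enough recursion will still exhaust the real C stack and crash the process.&lt;/p&gt;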

&lt;p&gt;Next, let's allocate a large amount of memory using a list:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;large_list = [0] * 1000000000
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this example, we create a list with a billion elements by using the multiplication operator to repeat the value 0 a billion times. Since each element in the list takes up memory, this can cause a heap overflow (in Python, a MemoryError) if the system does not have enough memory to allocate a billion elements.&lt;/p&gt;
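&lt;p&gt;Rather than actually exhausting memory, you can estimate the cost with sys.getsizeof. Note that this counts only the list's internal reference array (about 8 bytes per slot on a 64-bit build); all billion slots point to the same cached 0 object:&lt;/p&gt;

```python
import sys

# Measure the per-element cost from a small sample list.
sample = [0] * 1_000
per_element = (sys.getsizeof(sample) - sys.getsizeof([])) / 1_000

estimated_gb = per_element * 1_000_000_000 / 1e9
print(f"a billion-element list needs roughly {estimated_gb:.0f} GB for its references alone")
```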

&lt;p&gt;Python also includes a &lt;strong&gt;garbage collector&lt;/strong&gt; that automatically frees up memory that is no longer being used by the program. This helps to prevent memory leaks and makes it easier to write Python programs that don't have to worry about managing memory manually.&lt;/p&gt;

&lt;p&gt;Python's garbage collector runs automatically in the background, and most of the time, you don't need to worry about it. However, in some cases, it can be helpful to manually trigger the garbage collector using the &lt;strong&gt;gc.collect()&lt;/strong&gt; function.&lt;/p&gt;
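&lt;p&gt;For example, the collector is what reclaims reference cycles, which reference counting alone can never free:&lt;/p&gt;

```python
import gc

class Node:
    def __init__(self):
        self.ref = None

# Build a reference cycle: a points to b, and b points back to a.
a, b = Node(), Node()
a.ref, b.ref = b, a
del a, b  # the cycle keeps both objects alive despite having no outside references

# Force a full collection; gc.collect() returns the number of unreachable objects found.
collected = gc.collect()
print("unreachable objects collected:", collected)
```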

</description>
      <category>python</category>
      <category>memory</category>
    </item>
    <item>
      <title>Stack and Heap memory</title>
      <dc:creator>Sunish Surendran K </dc:creator>
      <pubDate>Tue, 07 Mar 2023 19:33:16 +0000</pubDate>
      <link>https://dev.to/sunishsurendrank/stack-and-heap-memory-4l1g</link>
      <guid>https://dev.to/sunishsurendrank/stack-and-heap-memory-4l1g</guid>
      <description>&lt;p&gt;Before jumping in to learn about stack and heap memory, let's time travel to the past.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4dCj4ucU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://media.tenor.com/g2r2v5AvZN8AAAAd/time-travel.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4dCj4ucU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://media.tenor.com/g2r2v5AvZN8AAAAd/time-travel.gif" alt="Past" width="640" height="480"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  The Past
&lt;/h1&gt;

&lt;p&gt;The concept of heap and stack memory is an important part of computer programming and memory management. The origins of these concepts can be traced back to the early days of computer science and programming languages.&lt;/p&gt;

&lt;p&gt;One of the earliest references to the concept of a "stack" in computer programming is attributed to John von Neumann, who developed the concept of a "last-in, first-out" (LIFO) data structure in the 1940s. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bbZEt4td--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xdhmv5rrtrrhrbqm7nt6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bbZEt4td--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xdhmv5rrtrrhrbqm7nt6.png" alt="John von Neumann" width="665" height="452"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;John von Neumann&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The idea of using a stack to manage memory in computer programs was later popularized by the ALGOL programming language in the 1950s.&lt;/p&gt;

&lt;p&gt;The concept of heap memory can also be traced back to the early days of computer science. The term "heap" was first used in reference to a data structure in the 1960s, and was popularized by the programming language Lisp. The idea of dynamic memory allocation, which allows programs to allocate memory at runtime, was also introduced around this time.&lt;/p&gt;

&lt;p&gt;Overall, the concepts of heap and stack memory have evolved over time as computer hardware and programming languages have advanced. Today, they remain an essential part of memory management in modern programming languages and operating systems.&lt;/p&gt;

&lt;p&gt;Also, some programming languages are designed to work without a conventional heap. For example, the "Forth" programming language, first developed in the 1970s, uses a distinctive memory model: it manages small, explicit data and return stacks itself, and stores both program code and data in a single, contiguous block of memory (the dictionary) rather than relying on a general-purpose heap.&lt;/p&gt;

&lt;h1&gt;
  
  
  Now let's come back &amp;amp; check what the Stack and Heap actually are
&lt;/h1&gt;

&lt;p&gt;When RAM is manufactured, it is initially unallocated, which means that it does not have any data stored on it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vMa-_FuS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hyvb31bcpdxs75nptyok.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vMa-_FuS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hyvb31bcpdxs75nptyok.gif" alt="Image description" width="498" height="280"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;However, the operating system of a computer, such as Windows or Linux, is responsible for managing the allocation of memory.&lt;/p&gt;

&lt;p&gt;When a program is executed, the operating system allocates memory to it, including both stack and heap memory. The stack typically sits near the top of the program's address space and grows downward, while the heap sits lower and grows upward toward it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ymyOiLKd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/300uqfdbe0v3kngfbwry.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ymyOiLKd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/300uqfdbe0v3kngfbwry.png" alt="Stack and Heap memory" width="800" height="270"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The size of the stack is usually predetermined at compile time, and the operating system sets aside a fixed amount of memory for the stack. When a function is called, the processor allocates a block of memory on the stack for the function's stack frame, which includes the function's parameters, local variables, and return address. As the function executes, it pushes and pops data on and off the stack.&lt;/p&gt;

&lt;p&gt;If we try to store dynamically generated data on the stack, it can cause a stack overflow because the stack has a limited size. The amount of memory required for dynamically generated data may be unpredictable and may exceed the available stack space, leading to unpredictable behavior and program crashes.&lt;/p&gt;

&lt;p&gt;Therefore, dynamic data is usually stored in the heap, which is a larger pool of memory that can be allocated and deallocated at runtime. The heap allows us to allocate memory for dynamically generated data as needed, and to release that memory when it is no longer needed, without worrying about stack limitations.&lt;/p&gt;
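&lt;p&gt;A quick sketch of the distinction in Python terms: a function's frame (its slice of the stack) disappears when the function returns, but an object it allocated on the heap survives for as long as something still references it:&lt;/p&gt;

```python
def make_record():
    # 'record' is a local reference living in this function's stack frame;
    # the dict it points to is allocated on the heap.
    record = {"name": "John", "age": 30}
    return record  # the frame is torn down here, but the heap object survives

person = make_record()  # 'person' now keeps the dict alive
print(person["name"])   # prints John
```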

&lt;p&gt;I hope this gave you an idea about stack and heap memory. If you think there is something more, please make a comment.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Hey, wait.. one more interesting thing to share: the cover page for this blog was generated by DALL-E, "a deep learning model developed by OpenAI to generate digital images from natural language descriptions".&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wuyxb4kg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/yn7d2p5abgy7gp4zaoe1.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wuyxb4kg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/yn7d2p5abgy7gp4zaoe1.JPG" alt="Image description" width="800" height="399"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>memory</category>
      <category>computerscience</category>
      <category>programming</category>
      <category>ram</category>
    </item>
    <item>
      <title>Quality Metrics</title>
      <dc:creator>Sunish Surendran K </dc:creator>
      <pubDate>Sun, 05 Mar 2023 08:16:48 +0000</pubDate>
      <link>https://dev.to/sunishsurendrank/quality-metrics-6in</link>
      <guid>https://dev.to/sunishsurendrank/quality-metrics-6in</guid>
      <description>&lt;h2&gt;
  
  
  Code Coverage
&lt;/h2&gt;

&lt;p&gt;The percentage of code that is covered by (automated) unit tests.&lt;/p&gt;

&lt;h2&gt;
  
  
  Abstract Interpretation
&lt;/h2&gt;

&lt;p&gt;Let's explain abstract interpretation with an example.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;if (obj == null) {
    obj = getObject();
}
obj.method();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the example above, abstract interpretation may detect one problem: the obj variable may still be null after the if block (if getObject() itself returns null), so the call obj.method() can fail at runtime.&lt;/p&gt;
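&lt;p&gt;The same flaw in Python terms (with a hypothetical get_object() that may itself return None): an abstract interpreter tracks the set of possible values of obj and notices that None survives the assignment:&lt;/p&gt;

```python
def get_object():
    # Hypothetical factory: it may legitimately fail and return None.
    return None

obj = None
if obj is None:
    obj = get_object()

# obj can still be None here, so calling obj.method() may raise at runtime.
try:
    obj.method()
except AttributeError as exc:
    print("caught at runtime, not at analysis time:", exc)
```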

&lt;h2&gt;
  
  
  Cyclomatic Complexity
&lt;/h2&gt;

&lt;p&gt;Programs with many conditional statements or loops are more complex. This hurts the maintainability of the program, since it is harder to understand for new developers working on it.&lt;/p&gt;
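&lt;p&gt;Real tools (such as radon for Python) compute McCabe's cyclomatic complexity; as a simplified sketch of the idea, we can count one plus the number of decision points in a piece of code:&lt;/p&gt;

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    """Simplified McCabe metric: 1 plus the number of branch points."""
    decision_nodes = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, decision_nodes) for node in ast.walk(tree))

code = """
def classify(x):
    if x < 0:
        return "negative"
    for _ in range(x):
        pass
    return "ok"
"""
print(cyclomatic_complexity(code))  # prints 3: the base path plus one 'if' and one 'for'
```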

&lt;h2&gt;
  
  
  Compiler Warnings
&lt;/h2&gt;

&lt;p&gt;Compiler warnings are potential errors in the source code that are discovered during compilation. These include syntactic errors for which the semantics are easily misinterpreted, portability issues and type errors.&lt;/p&gt;

&lt;h2&gt;
  
  
  Coding Standards
&lt;/h2&gt;

&lt;p&gt;Coding standards provide rules for software developers to follow while writing code. The most important reason for coding standards is maintainability; a clear set of rules makes it easier for programmers to understand the meaning of code.&lt;/p&gt;

&lt;h2&gt;
  
  
  Code Duplication
&lt;/h2&gt;

&lt;p&gt;Code Duplication is a software metric that indicates the amount of source code that occurs more than once in a program.&lt;/p&gt;

&lt;h2&gt;
  
  
  Fan Out
&lt;/h2&gt;

&lt;p&gt;The fan out metric indicates how many different modules are used by a certain module. &lt;/p&gt;

&lt;p&gt;For Python, the number of modules mentioned in import statements is counted; a from-import counts as 1.&lt;/p&gt;
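&lt;p&gt;A minimal sketch of that counting rule using Python's own ast module (an illustrative helper, not a standard tool):&lt;/p&gt;

```python
import ast

def fan_out(source: str) -> int:
    """Count imported modules: each name in an 'import' counts, a from-import counts as 1."""
    count = 0
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            count += len(node.names)   # 'import os, sys' contributes 2
        elif isinstance(node, ast.ImportFrom):
            count += 1                 # 'from json import dumps, loads' contributes 1
    return count

print(fan_out("import os, sys\nfrom json import dumps, loads\n"))  # prints 3
```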

</description>
    </item>
    <item>
      <title>Thoughts on :Nginx</title>
      <dc:creator>Sunish Surendran K </dc:creator>
      <pubDate>Sun, 14 Nov 2021 15:41:36 +0000</pubDate>
      <link>https://dev.to/sunishsurendrank/thoughts-on-nginx-3ok5</link>
      <guid>https://dev.to/sunishsurendrank/thoughts-on-nginx-3ok5</guid>
      <description>&lt;p&gt;A fantastic way to get to know a good software product is to experience it firsthand. So let's try Nginx today. And I have a weird hobby of connecting my favorite software products with comic characters.&lt;/p&gt;

&lt;p&gt;I have chosen Green Lantern as Nginx &lt;br&gt;
&lt;em&gt;"All the powers that the Power Ring gives"&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fLBkRYo---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xyvb3k7qaxlsfia9gg9z.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fLBkRYo---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xyvb3k7qaxlsfia9gg9z.gif" alt="Image description" width="400" height="230"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Nginx [engine x] is a free, open-source, high-performance web server that can be used as a: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Reverse proxy&lt;/li&gt;
&lt;li&gt;Load balancer&lt;/li&gt;
&lt;li&gt;SMTP proxy&lt;/li&gt;
&lt;li&gt;TCP/UDP proxy server. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;There is also a commercial edition of Nginx, called Nginx Plus.&lt;/p&gt;
&lt;h2&gt;
  
  
  Source Code:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rTfZb71M--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ep5cfgm7z278055hyq2c.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rTfZb71M--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ep5cfgm7z278055hyq2c.PNG" alt="Image description" width="647" height="537"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Nginx source code is published in a Mercurial repository: &lt;a href="http://hg.nginx.org/nginx/file/tip"&gt;http://hg.nginx.org/nginx/file/tip&lt;/a&gt;. This may be new information for you: Mercurial is a free, multiplatform, distributed version control system. Mercurial is used via the command line; the program file itself is called hg. So, just like with Git, we can download the Nginx code using the command below.&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;hg clone http://hg.nginx.org/nginx&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;If you want to build Nginx from the source code, please follow this link: &lt;a href="http://nginx.org/en/docs/howto_build_on_win32.html"&gt;http://nginx.org/en/docs/howto_build_on_win32.html&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  NGINX Configuration file
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HsXTApEQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ztjhatnc467han9qtnw3.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HsXTApEQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ztjhatnc467han9qtnw3.PNG" alt="Image description" width="457" height="701"&gt;&lt;/a&gt;&lt;br&gt;
NGINX is configured through a configuration file. By default this file is named nginx.conf and placed under /usr/local/nginx/conf, /etc/nginx, or /usr/local/etc/nginx.&lt;/p&gt;

&lt;p&gt;The Nginx configuration file consists of directives and contexts. &lt;/p&gt;

&lt;h2&gt;
  
  
  What are Directives and Contexts?
&lt;/h2&gt;

&lt;p&gt;NGINX configuration consists of key-value pairs called directives. Directives decide which configuration to apply. They can be organized and grouped into blocks known as contexts.&lt;/p&gt;

&lt;p&gt;Contexts are tree-like structures that can be nested within one another, and directives can only be used within contexts.&lt;/p&gt;

&lt;p&gt;Directives and contexts look like this:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1DcVOAyr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9j4ldxjyzfeqxyn0t231.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1DcVOAyr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9j4ldxjyzfeqxyn0t231.PNG" alt="Image description" width="800" height="405"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Nginx Architecture
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--i81CGg16--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8s47uh7okcdzvtwqx0hc.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--i81CGg16--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8s47uh7okcdzvtwqx0hc.PNG" alt="Image description" width="800" height="526"&gt;&lt;/a&gt;&lt;br&gt;
Nginx has one master process and several worker processes.&lt;/p&gt;

&lt;p&gt;The main purpose of the master process is to read and evaluate configuration, and maintain worker processes. &lt;/p&gt;

&lt;p&gt;Worker processes do the actual processing of requests. Nginx employs an event-based model and OS-dependent mechanisms to efficiently distribute requests among worker processes. &lt;/p&gt;

&lt;p&gt;The number of worker processes is defined in the configuration file and may be fixed for a given configuration. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Jvv1Hkgo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bzsojpf3envtpujswu64.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Jvv1Hkgo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bzsojpf3envtpujswu64.PNG" alt="Image description" width="800" height="304"&gt;&lt;/a&gt;&lt;br&gt;
If we set it to auto, as below, the number of worker processes is automatically adjusted to the number of available CPU cores&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--f9_j36Kg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rgh96lgpr4dip6a5nm6n.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--f9_j36Kg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rgh96lgpr4dip6a5nm6n.PNG" alt="Image description" width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By default, each worker process can open up to 512 connections&lt;/p&gt;

&lt;p&gt;As previously mentioned, nginx doesn't spawn a process or thread for every connection. Instead, worker processes accept new requests from a shared "listen" socket and execute a highly efficient run-loop inside each worker to process thousands of connections per worker.&lt;/p&gt;
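&lt;p&gt;Putting the two settings above together in nginx.conf (illustrative values):&lt;/p&gt;

```nginx
worker_processes auto;       # one worker per available CPU core

events {
    worker_connections 512;  # the default; raise this for busier servers
}
```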

&lt;h2&gt;
  
  
  Extending Nginx:
&lt;/h2&gt;

&lt;p&gt;Nginx can be extended using third-party modules. There is a large ecosystem of third‑party modules, ranging from language interpreters to security solutions.&lt;/p&gt;

&lt;p&gt;Have a look at the Nginx blog post below about compiling dynamic modules for Nginx:&lt;br&gt;
&lt;a href="https://www.nginx.com/blog/compiling-dynamic-modules-nginx-plus/"&gt;https://www.nginx.com/blog/compiling-dynamic-modules-nginx-plus/&lt;/a&gt; &lt;/p&gt;

</description>
    </item>
    <item>
      <title>Understanding GitHub Copilot(AI Pair Programmer)</title>
      <dc:creator>Sunish Surendran K </dc:creator>
      <pubDate>Thu, 01 Jul 2021 12:03:12 +0000</pubDate>
      <link>https://dev.to/sunishsurendrank/understanding-github-copilot-ai-pair-programmer-5g69</link>
      <guid>https://dev.to/sunishsurendrank/understanding-github-copilot-ai-pair-programmer-5g69</guid>
      <description>&lt;p&gt;GitHub Copilot is an AI pair programmer which suggests line completions and entire function bodies as you type.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;"Hmm that's Cool right!! But i think there is lot of kerfuffle around it."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Qxk8V6mL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/khl1bwo57kt6o6az4s1f.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Qxk8V6mL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/khl1bwo57kt6o6az4s1f.PNG" alt="Alt Text" width="542" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;GitHub Copilot is powered by the &lt;strong&gt;OpenAI Codex AI system&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;"You might me now thinking what is this OPenAI Codex AI system."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;OpenAI is an AI research and deployment company. Its investors include Microsoft, Reid Hoffman’s charitable foundation, and Khosla Ventures. The company's goal is to promote and develop friendly AI in a way that benefits humanity as a whole.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IupJi8Mi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jpbm539s4ann8txxpgp2.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IupJi8Mi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jpbm539s4ann8txxpgp2.PNG" alt="Alt Text" width="557" height="459"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Recently, OpenAI transitioned from non-profit to for-profit.&lt;br&gt;
They have developed many products and have published papers on generative pre-training of language models. This generative pre-training is commonly known by its abbreviated form, &lt;strong&gt;GPT&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;So, coming back to GitHub Copilot: it is built on a new system called &lt;strong&gt;OpenAI Codex, which OpenAI CTO Greg Brockman describes as a descendant of GPT-3&lt;/strong&gt;. While GPT-3 generates English, OpenAI Codex generates code.&lt;/p&gt;

&lt;p&gt;It works best with Python, JavaScript, TypeScript, Ruby, and Go, according to a blog post from GitHub.&lt;/p&gt;

&lt;p&gt;Since GitHub is now owned by Microsoft, Microsoft can easily bring OpenAI technology to GitHub. For now, Microsoft is only offering a service that knows about code stored in public repositories. In the future, we can expect that for enterprise customers too.&lt;/p&gt;

&lt;p&gt;GitHub has nice &lt;a href="https://copilot.github.com/"&gt;documentation&lt;/a&gt; written for Copilot. Have a look.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;"Now the question will be how to use it"&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;It is available today as a Visual Studio Code extension.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Z6Ox7YtX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ltnfpmzg8qdprfaggh10.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Z6Ox7YtX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ltnfpmzg8qdprfaggh10.PNG" alt="Alt Text" width="530" height="283"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;But currently you need to sign up, and access is limited to a small group of testers during the technical preview.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>codenewbie</category>
      <category>github</category>
      <category>news</category>
    </item>
    <item>
      <title>WSL,VSCode Terminal: "Coddled and Cosseted"</title>
      <dc:creator>Sunish Surendran K </dc:creator>
      <pubDate>Sun, 09 May 2021 12:33:20 +0000</pubDate>
      <link>https://dev.to/sunishsurendrank/wsl-vscode-terminal-coddled-and-cosseted-aio</link>
      <guid>https://dev.to/sunishsurendrank/wsl-vscode-terminal-coddled-and-cosseted-aio</guid>
      <description>&lt;p&gt;&lt;strong&gt;FLAG:&lt;/strong&gt;   &lt;em&gt;"If you have more time as a developer to improve your Terminal look and also if you're a Windows user, this blog can help you!"&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Here we try out some fancy stuff for the WSL terminal.&lt;/p&gt;

&lt;p&gt;The plan is to use WSL via windows Terminal. For this first install WSL and Windows Terminal on your machine.&lt;/p&gt;

&lt;p&gt;Go through the below &lt;strong&gt;6 steps&lt;/strong&gt; and have the awesome terminal on your machine.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;STEP 1&lt;/strong&gt; : Install zsh in WSL
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apt-get install zsh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bxlYko1s--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/utiiu3xpi0a7v0p98c2u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bxlYko1s--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/utiiu3xpi0a7v0p98c2u.png" alt="Alt Text" width="800" height="474"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;STEP 2&lt;/strong&gt; : Install "Oh My Zsh"&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;visit the website &lt;a href="https://ohmyz.sh/#install"&gt;https://ohmyz.sh/#install&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sh -c "$(curl -fsSL https://raw.github.com/ohmyzsh/ohmyzsh/master/tools/install.sh)"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bGXIKUPV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jz1u854gfjo8ttmkfxjl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bGXIKUPV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jz1u854gfjo8ttmkfxjl.png" alt="Alt Text" width="800" height="444"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;STEP 3&lt;/strong&gt; : Install PowerLevel10k
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git clone https://github.com/romkatv/powerlevel10k.git $ZSH_CUSTOM/themes/powerlevel10k
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;STEP 4&lt;/strong&gt; : Install the font FiraCode NF (we need this font for the PowerLevel10k theme to render correctly.)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Download the Font from the &lt;a href="https://github.com/ryanoasis/nerd-fonts/raw/master/patched-fonts/FiraCode/Medium/complete/Fira%20Code%20Medium%20Nerd%20Font%20Complete%20Windows%20Compatible.ttf"&gt;link&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After downloading the ttf file into your machine, just double-click the ttf file. The file will open, then press the install button at the top. This will install the font on your windows machine.&lt;/p&gt;

&lt;p&gt;Now we need to map the font to the WSL Terminal, for that go to the settings of the terminal.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Yd3nRFC7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7ndivchkqg7p159mmhdj.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Yd3nRFC7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7ndivchkqg7p159mmhdj.JPG" alt="Alt Text" width="554" height="331"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click on the Open JSON file&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--e-SHp4Ez--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4lfi7oobp9rtdr0qep9a.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--e-SHp4Ez--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4lfi7oobp9rtdr0qep9a.JPG" alt="Alt Text" width="749" height="608"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The JSON file looks like the one below&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--XLwuB_4D--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ead11jhbznrw8ja3u8tt.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XLwuB_4D--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ead11jhbznrw8ja3u8tt.JPG" alt="Alt Text" width="800" height="550"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Copy the code below and paste it into your JSON file.&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;NOTE: replace the guid value with the GUID of your Ubuntu instance&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;            {
                "fontFace": "FiraCode NF",
                "guid": "{YourUbuntuguid}",
                "hidden": false,
                "name": "Ubuntu",
                "snapOnInput": true,
                "source": "Windows.Terminal.Wsl",
                "useAcrylic": true
            },
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You need to copy the code to the appropriate section of the JSON file as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fbLzlN5S--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hugfl8du4jhazxsa7vf6.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fbLzlN5S--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hugfl8du4jhazxsa7vf6.JPG" alt="Alt Text" width="800" height="789"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;STEP 5&lt;/strong&gt; : Configuring Powerlevel10k &lt;/p&gt;

&lt;p&gt;Go to WSL and open the .zshrc file in your home directory.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;vi ~/.zshrc
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Add the line below and save the file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ZSH_THEME="powerlevel10k/powerlevel10k"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
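&lt;p&gt;If you prefer a non-interactive edit (for example in a setup script), the same change can be made with sed. This is a sketch demonstrated on a scratch copy so your real ~/.zshrc stays untouched; swap in ~/.zshrc to apply it for real.&lt;/p&gt;

```shell
# Create a stand-in for ~/.zshrc, then rewrite its ZSH_THEME line in place.
echo 'ZSH_THEME="robbyrussell"' > /tmp/zshrc.demo
sed -i 's|^ZSH_THEME=.*|ZSH_THEME="powerlevel10k/powerlevel10k"|' /tmp/zshrc.demo
cat /tmp/zshrc.demo   # -> ZSH_THEME="powerlevel10k/powerlevel10k"
```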



&lt;p&gt;Reload the .zshrc file in your terminal&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;source ~/.zshrc
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The Powerlevel10k configuration screen should then appear; if it does not, type the command below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;p10k configure
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Follow the instructions and set up the new look, as shown below!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0ZbEnTqg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ohoqeu4lppkxs0qak9od.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0ZbEnTqg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ohoqeu4lppkxs0qak9od.JPG" alt="Alt Text" width="800" height="291"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MDExC7pj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2o4xc7xwnqz88jbdub68.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MDExC7pj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2o4xc7xwnqz88jbdub68.JPG" alt="Alt Text" width="800" height="276"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---O6Q6UXY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/smt3wk6j7ttg7arsfrmc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---O6Q6UXY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/smt3wk6j7ttg7arsfrmc.png" alt="Alt Text" width="498" height="368"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Ohhh, wait!&lt;/strong&gt; We have a bit more work: we want our VS Code terminal to look cool too.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;STEP 6&lt;/strong&gt; : Updating the VS Code terminal font&lt;/p&gt;

&lt;p&gt;Open VS Code and go to its settings.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--o3D7JHM0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g712b1k1qmn8pnaxw8yc.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--o3D7JHM0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g712b1k1qmn8pnaxw8yc.JPG" alt="Alt Text" width="318" height="280"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Search for the setting below.&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Terminal integrated font family&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xwe4IRKR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/oem2g06k54av6ywqrtg8.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xwe4IRKR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/oem2g06k54av6ywqrtg8.JPG" alt="Alt Text" width="800" height="267"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Update the font name here as well.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_wuH_gvM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zmo16klkf1ndgpjbhfxh.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_wuH_gvM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zmo16klkf1ndgpjbhfxh.JPG" alt="Alt Text" width="800" height="398"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Done! If you open the WSL terminal in VS Code, it looks cool too.&lt;br&gt;
Enjoy!&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>codenewbie</category>
      <category>computerscience</category>
      <category>linux</category>
    </item>
    <item>
      <title>Deploying Kubernetes Operators using Operator Life Cycle Manager</title>
      <dc:creator>Sunish Surendran K </dc:creator>
      <pubDate>Sat, 20 Mar 2021 09:11:43 +0000</pubDate>
      <link>https://dev.to/sunishsurendrank/installing-kubernetes-operators-using-operator-life-cycle-manager-3i6d</link>
      <guid>https://dev.to/sunishsurendrank/installing-kubernetes-operators-using-operator-life-cycle-manager-3i6d</guid>
      <description>&lt;h1&gt;
  
  
  TL;DR
&lt;/h1&gt;

&lt;p&gt;Welcome to the world of Kubernetes Operators!!&lt;/p&gt;

&lt;p&gt;With the introduction of Custom Resource Definitions (CRDs) in version 1.7, Kubernetes became much more extensible, and developers are paying more attention to the Operator world. Nowadays many applications are deployed as operators, which carry deep knowledge of how the system ought to behave, how to deploy it, and how to react if there are problems.&lt;/p&gt;

&lt;p&gt;To make the development and maintenance of operators easier, &lt;strong&gt;RedHat&lt;/strong&gt; introduced the Operator Framework; it has an SDK, OLM (Operator Lifecycle Manager), and an operator registry.&lt;/p&gt;

&lt;p&gt;In this article, let's explore how we can deploy an operator to the Kubernetes cluster with the help of the Operator lifecycle manager.&lt;/p&gt;

&lt;h1&gt;
  
  
  Creating Kubernetes Cluster using &lt;a href="https://kind.sigs.k8s.io/" rel="noopener noreferrer"&gt;KIND&lt;/a&gt;
&lt;/h1&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"kind is a tool for running local Kubernetes clusters using Docker container nodes"&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;KIND comes out of a Kubernetes Special Interest Group, and I can say confidently that it is going to be a very useful tool.&lt;/p&gt;

&lt;p&gt;OK, for now just download the KIND binary and run the command below on your machine. This will bring up a new Kubernetes cluster named testcluster.&lt;/p&gt;

&lt;p&gt;Since this article is not about KIND, I am not going to explain all the prerequisites for running it; you can refer to the KIND documentation.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Kind create cluster –name testcluster –config=kindconfig.yaml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here I used a &lt;a href="https://github.com/sunishsurendrank/K8_infra/blob/main/kind/kindconfig.yaml" rel="noopener noreferrer"&gt;kindconfig.yaml&lt;/a&gt; because I wanted to map a few extra ports to my host machine (optional).&lt;/p&gt;
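&lt;p&gt;For reference, a minimal kindconfig.yaml with an extra port mapping looks roughly like the sketch below. The ports 30080 and 8080 are illustrative; pick the ones you actually need.&lt;/p&gt;

```shell
# Write a minimal KIND cluster config that maps container port 30080 to
# host port 8080 (port values are examples only).
printf '%s\n' \
  'kind: Cluster' \
  'apiVersion: kind.x-k8s.io/v1alpha4' \
  'nodes:' \
  '  - role: control-plane' \
  '    extraPortMappings:' \
  '      - containerPort: 30080' \
  '        hostPort: 8080' \
  '        protocol: TCP' > /tmp/kindconfig.yaml
cat /tmp/kindconfig.yaml
```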

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs8s2rbfco9ss335wy9dl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs8s2rbfco9ss335wy9dl.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffeilv6hd7emziluznn02.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffeilv6hd7emziluznn02.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Within minutes the Kubernetes cluster is created and up and running.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft32s6jj5n2vjnjgnrgd2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft32s6jj5n2vjnjgnrgd2.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h1&gt;
  
  
  Deploying OLM into the newly created Cluster
&lt;/h1&gt;

&lt;p&gt;I have already mentioned RedHat's Operator Framework; now let's look at it in detail.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"The Operator Framework is an open-source toolkit to manage Kubernetes native applications, called Operators, in an effective, automated, and scalable way."&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Operator Framework consists of three major tools &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Operator-sdk&lt;/strong&gt; – For the development of the operators.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Operator-Lifecycle-Manager (OLM)&lt;/strong&gt; – For managing the lifecycle of the operator.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Operator-registry&lt;/strong&gt; – For storing the operator metadata. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;OLM is not deployed by default as part of a Kubernetes installation; we have to install it separately.&lt;/p&gt;

&lt;p&gt;As of now, the latest release is 0.17.0. By running the command below on the machine whose kubeconfig points to the newly created KIND cluster, OLM will be deployed to the cluster.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -sL https://github.com/operator-framework/operator-lifecycle-manager/releases/download/v0.17.0/install.sh | bash -s v0.17.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftv2ik58hk4svq4vqb1l4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftv2ik58hk4svq4vqb1l4.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If we do "kubectl get namespace" we can see new two namespaces introduced in the cluster - olm and operators.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvzctffi9chj91kabg9uv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvzctffi9chj91kabg9uv.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
Listing the pods in the olm namespace, we can see the two major pods running there: the &lt;strong&gt;olm operator&lt;/strong&gt; and the &lt;strong&gt;catalog operator&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F17g6xpxg44bkbs60mnfz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F17g6xpxg44bkbs60mnfz.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fthx36daz9x29v8piqdq1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fthx36daz9x29v8piqdq1.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I know it is getting a little confusing: to install an Operator we install OLM, and OLM's major components are themselves Operators. Crazy, right?&lt;/p&gt;

&lt;p&gt;This means OLM is not mandatory for installing an operator, but it helps with managing the operator's lifecycle: OLM has components that upgrade the operator automatically when a new version is available, so no human intervention is needed.&lt;/p&gt;
&lt;h4&gt;
  
  
  OLM Operator
&lt;/h4&gt;

&lt;p&gt;The OLM Operator is responsible for deploying applications defined by a CSV (ClusterServiceVersion).&lt;/p&gt;
&lt;h4&gt;
  
  
  Catalog Operator
&lt;/h4&gt;

&lt;p&gt;The Catalog Operator is responsible for watching CatalogSources for updates of packages in channels and upgrading them to the latest available versions.&lt;/p&gt;

&lt;p&gt;So now the question is: what are this CSV and CatalogSource? As I mentioned above, the OLM components are themselves operators, so they expose Custom Resource Definitions. CSV and CatalogSource are among the CRDs exposed by the OLM operators.&lt;/p&gt;

&lt;p&gt;The diagram below shows the list of CRDs owned by the olm operator and the catalog operator.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F489d6lo2n5wqqoxgoe6a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F489d6lo2n5wqqoxgoe6a.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmey3rw74afmsoomu4o98.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmey3rw74afmsoomu4o98.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h1&gt;
  
  
  Deploying Operators using OLM
&lt;/h1&gt;

&lt;p&gt;OK! Now we have a cluster with OLM deployed, so the next thing we need is an operator to deploy into it. Let's visit &lt;a href="https://operatorhub.io/" rel="noopener noreferrer"&gt;&lt;strong&gt;OperatorHub&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2ehwi6demqeidsyex0f4.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2ehwi6demqeidsyex0f4.jpg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"OperatorHub.io is a new home for the Kubernetes community to share Operators. Find an existing Operator or list your own today. "&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Put simply, OperatorHub is like Docker Hub, but for operator catalogs.&lt;/p&gt;

&lt;p&gt;For this article, I am going to download and install the ArgoCD operator from OperatorHub.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzygi3vgmycntdtwwql6a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzygi3vgmycntdtwwql6a.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click the Install button and it will show the kubectl command that installs the operator in the Kubernetes cluster.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;kubectl create -f https://operatorhub.io/install/argocd-operator-helm.yaml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fearlx11atuelwfmlg9ak.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fearlx11atuelwfmlg9ak.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After running the kubectl command, a new namespace &lt;strong&gt;my-argocd-operator-helm&lt;/strong&gt; is created, and in that namespace an ArgoCD operator pod is running.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqrjke3rywafsnpil9h2v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqrjke3rywafsnpil9h2v.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnaaovt8rs3iogunncln2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnaaovt8rs3iogunncln2.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4lra14jdk3kp60tch2j2.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4lra14jdk3kp60tch2j2.JPG" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  What happened in the backend: how did the ArgoCD Operator get deployed?
&lt;/h3&gt;

&lt;p&gt;Let’s open and see the yaml file we applied to the cluster.&lt;br&gt;
&lt;a href="https://operatorhub.io/install/argocd-operator-helm.yaml" rel="noopener noreferrer"&gt;https://operatorhub.io/install/argocd-operator-helm.yaml&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxvt759uggag5lxo5kll4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxvt759uggag5lxo5kll4.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; First, it creates a namespace called &lt;strong&gt;my-argocd-operator-helm&lt;/strong&gt;.
&lt;/li&gt;
&lt;li&gt; Then it creates an OperatorGroup in the namespace &lt;strong&gt;my-argocd-operator-helm&lt;/strong&gt;. The OperatorGroup kind is a CRD exposed by OLM. &lt;/li&gt;
&lt;li&gt; It also creates a &lt;strong&gt;Subscription&lt;/strong&gt; (another CRD exposed by OLM). A Subscription has a field that names the channel; the channel (such as alpha, beta, or stable) determines which stream of the operator is deployed from the CatalogSource.&lt;/li&gt;
&lt;li&gt; The &lt;strong&gt;Subscription&lt;/strong&gt; also defines where the catalog should be downloaded from; in our case it is operatorhub.&lt;/li&gt;
&lt;/ol&gt;
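&lt;p&gt;For illustration, a Subscription manifest has roughly the shape sketched below. The field values here are representative only; the actual argocd-operator-helm.yaml from OperatorHub is authoritative.&lt;/p&gt;

```shell
# Write an illustrative OLM Subscription manifest (values are examples).
printf '%s\n' \
  'apiVersion: operators.coreos.com/v1alpha1' \
  'kind: Subscription' \
  'metadata:' \
  '  name: my-argocd-operator-helm' \
  '  namespace: my-argocd-operator-helm' \
  'spec:' \
  '  channel: alpha               # stream to follow (alpha/beta/stable)' \
  '  name: argocd-operator-helm   # package name in the catalog' \
  '  source: operatorhubio-catalog      # CatalogSource to pull from' \
  '  sourceNamespace: olm               # namespace of that CatalogSource' > /tmp/subscription.yaml
cat /tmp/subscription.yaml
```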

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnyt7mcqmz0koc9uhog8o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnyt7mcqmz0koc9uhog8o.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; When a &lt;strong&gt;Subscription&lt;/strong&gt; is created, the &lt;strong&gt;Catalog Operator&lt;/strong&gt; creates an &lt;strong&gt;InstallPlan&lt;/strong&gt;. By default, OLM automatically approves updates to an operator as new versions become available via a CatalogSource. &lt;/li&gt;
&lt;li&gt; As specified in the Subscription, the Catalog Operator contacts OperatorHub for the argocd operator. &lt;/li&gt;
&lt;li&gt; The latest version in the channel named in the Subscription yaml is downloaded from OperatorHub, and it contains a ClusterServiceVersion.&lt;/li&gt;
&lt;li&gt; The &lt;strong&gt;olm Operator&lt;/strong&gt; then installs the operator by reading the ClusterServiceVersion.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This is how the operator got deployed in the namespace my-argocd-operator-helm.&lt;/p&gt;

</description>
      <category>kubernetes</category>
      <category>redhat</category>
      <category>olm</category>
      <category>operator</category>
    </item>
    <item>
      <title>Testing with TestProject Docker Agent</title>
      <dc:creator>Sunish Surendran K </dc:creator>
      <pubDate>Sat, 28 Nov 2020 08:05:56 +0000</pubDate>
      <link>https://dev.to/sunishsurendrank/testing-with-test-project-docker-agent-iig</link>
      <guid>https://dev.to/sunishsurendrank/testing-with-test-project-docker-agent-iig</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;"Well-run tests will help you avoid serious grief in the long run"&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The above quote is of great significance: I remember my dark days of software development, when I felt like I had entered a dark alley without my slingshot or leather sandals.&lt;/p&gt;

&lt;p&gt;Of course, I'm referring to proper Testing and avoiding fallacious approaches. But we have come a long way from those dark days: contemporary &lt;strong&gt;&lt;em&gt;Testing&lt;/em&gt;&lt;/strong&gt; techniques have emerged, and they have propelled the need for an efficient, reliable, fast, and powerful automation tool that fits the diverse skill sets of everyone on an agile team. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;Here I am introducing you to one of the newest entrants, "TestProject"&lt;/em&gt;&lt;/strong&gt; &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;An open-source-friendly, community-powered automation testing tool. The most amazing part is its &lt;strong&gt;&lt;em&gt;free forever plan&lt;/em&gt;&lt;/strong&gt;, which is fully featured, so you can get started within seconds.&lt;/p&gt;

&lt;p&gt;Personally, I have tried TestProject for my research projects. I can strongly argue that it is of high quality, and my experience so far has been &lt;strong&gt;&lt;em&gt;top-notch&lt;/em&gt;&lt;/strong&gt;. So let's dig into it.&lt;/p&gt;

&lt;blockquote&gt;
&lt;h2&gt;
  
  
  💻How to start with TestProject
&lt;/h2&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Step 1 - &lt;strong&gt;&lt;em&gt;Signup for free&lt;/em&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;All we need to do is sign up to &lt;a href="https://testproject.io/" rel="noopener noreferrer"&gt;Testproject&lt;/a&gt;. You can check the &lt;a href="https://docs.testproject.io/getting-started/creating-an-account" rel="noopener noreferrer"&gt;documentation&lt;/a&gt; if you need help with the free signup.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5xh24o9ik06hn55j6xj6.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5xh24o9ik06hn55j6xj6.JPG" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you have created an account with TestProject you just need to follow a few simple steps to get setup and ready for testing.  &lt;strong&gt;&lt;em&gt;TestProject works on almost any platform, with the install of a single agent.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Step 2 - &lt;strong&gt;&lt;em&gt;Generate an API key&lt;/em&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Generate an API key, which grants access to the API for your account. To do that, open the TestProject application, go to the Integration tab, and select the API option.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F22owyd9r2uqijnpa76pu.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F22owyd9r2uqijnpa76pu.JPG" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
From there you can create a new API key and set the access you want it to have. Once you've set up the key, you can copy it and use it in the API calls you make.&lt;/p&gt;

&lt;blockquote&gt;
&lt;h2&gt;
  
  
  🐳 How to setup TestProject Docker Agent
&lt;/h2&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F2qsfca3uk2q4r7ojs5md.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F2qsfca3uk2q4r7ojs5md.JPG" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;CI/CD teams have already moved on from traditional testing to automation. Nowadays automated testing is embedded so deeply in our perception of software development that it's hard to imagine one without the other, and it ultimately enables us to produce software quickly without sacrificing quality.&lt;/p&gt;

&lt;p&gt;And Docker is the de facto standard to build, run, test, and share containerized apps from your desktop to the cloud. For this world, &lt;strong&gt;&lt;em&gt;TestProject comes with a Docker Agent&lt;/em&gt;&lt;/strong&gt;. The Docker agent is useful because it saves a bunch of resources and provides the ability to run tests with a single command.&lt;/p&gt;
&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Docker must be installed on your machine&lt;/li&gt;
&lt;li&gt;In this article we install the agent with a docker-compose YAML file, so Docker Compose must also be installed.&lt;/li&gt;
&lt;/ul&gt;
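&lt;p&gt;A quick sketch to check both prerequisites from a shell before continuing:&lt;/p&gt;

```shell
# Report whether docker and docker-compose are on PATH.
for tool in docker docker-compose; do
  if command -v "$tool" >/dev/null; then
    echo "$tool: found"
  else
    echo "$tool: missing - install it before continuing"
  fi
done
```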
&lt;h3&gt;
  
  
  Step 1 -&lt;strong&gt;&lt;em&gt;Docker-Compose for TestProject Docker Agent&lt;/em&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Visit the TestProject &lt;a href="https://hub.docker.com/r/testproject/agent" rel="noopener noreferrer"&gt;dockerhub&lt;/a&gt; page; there the TestProject team explains all the different ways to spin up the Docker agent. We are using the docker-compose way. The following docker-compose snippet can be used to start a TestProject Agent with headless Chrome and Firefox browsers.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="c1"&gt;# NOTE: Make sure to update the 'testproject-agent' container volume with a valid local path.&lt;/span&gt;
&lt;span class="c1"&gt;# To execute this docker-compose, store it in a file (e.g. testproject-agent.yaml) an run:&lt;/span&gt;
&lt;span class="c1"&gt;# docker-compose -f &amp;lt;file_name&amp;gt; up -d&lt;/span&gt;

&lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;3.1"&lt;/span&gt;
&lt;span class="na"&gt;services&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;testproject-agent&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;testproject/agent:latest&lt;/span&gt;
    &lt;span class="na"&gt;container_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;testproject-agent&lt;/span&gt;
    &lt;span class="na"&gt;depends_on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;chrome&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;firefox&lt;/span&gt;
    &lt;span class="na"&gt;volumes&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;.:/var/testproject/agent&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;TP_AGENT_ALIAS&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Docker&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Agent"&lt;/span&gt;
      &lt;span class="na"&gt;TP_API_KEY&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;REPLACE_WITH_YOUR_KEY"&lt;/span&gt;
      &lt;span class="na"&gt;TP_JOB_PARAMS&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;"jobParameters"&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;:&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;"browsers":&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;[&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;"chrome",&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;"firefox"&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;]&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}'&lt;/span&gt;
      &lt;span class="na"&gt;CHROME&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;chrome:4444"&lt;/span&gt;
      &lt;span class="na"&gt;FIREFOX&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;firefox:4444"&lt;/span&gt;
  &lt;span class="na"&gt;chrome&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;selenium/standalone-chrome&lt;/span&gt;
    &lt;span class="na"&gt;volumes&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;/dev/shm:/dev/shm&lt;/span&gt;
  &lt;span class="na"&gt;firefox&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;selenium/standalone-firefox&lt;/span&gt;
    &lt;span class="na"&gt;volumes&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;/dev/shm:/dev/shm&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We need to update the &lt;strong&gt;&lt;em&gt;TP_API_KEY &amp;amp; TP_JOB_ID&lt;/em&gt;&lt;/strong&gt; environment variables to instruct the Agent to point to our TestProject account and run the job automatically on startup.&lt;/p&gt;

&lt;p&gt;In our case we have not created a job yet, so I have omitted the TP_JOB_ID environment variable and added only the TP_API_KEY we created earlier.&lt;/p&gt;
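&lt;p&gt;For reference, once a job exists in your TestProject account, its ID can be supplied next to the API key. A minimal sketch of the environment section with both variables set; the values here are placeholders, not real keys:&lt;/p&gt;

```yaml
# Placeholder values - substitute your own TestProject API key and job ID.
environment:
  TP_AGENT_ALIAS: "Docker Agent"
  TP_API_KEY: "REPLACE_WITH_YOUR_KEY"
  # Optional: when set, the agent runs this job automatically on startup.
  # Omit it (as in this article) to just register the agent.
  TP_JOB_ID: "REPLACE_WITH_YOUR_JOB_ID"
```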

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Frvs0tujj2x4fgicsh7xx.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Frvs0tujj2x4fgicsh7xx.JPG" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2-&lt;strong&gt;&lt;em&gt;Spinning up the TestProject Docker Agent&lt;/em&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;I have saved my docker-compose YAML file as TestProject-DockerAgent.yaml. Running the command below brings the TestProject agent up.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker-compose &lt;span class="nt"&gt;-f&lt;/span&gt; .&lt;span class="se"&gt;\T&lt;/span&gt;estProject-DockerAgent.yaml up
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
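&lt;p&gt;If you prefer to run the stack in the background, the standard docker-compose flags apply. A sketch, assuming the same file name as above (adjust to yours):&lt;/p&gt;

```shell
# Start the agent and both browser containers in detached mode
docker-compose -f TestProject-DockerAgent.yaml up -d

# List the containers and confirm they are up
docker-compose -f TestProject-DockerAgent.yaml ps

# Follow the agent log until it reports "Agent initialization is complete."
docker-compose -f TestProject-DockerAgent.yaml logs -f testproject-agent
```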



&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1596ernl763sken8auou.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1596ernl763sken8auou.JPG" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Within minutes the Docker agent is ready with headless Chrome &amp;amp; Firefox browsers, as the log line "Agent initialization is complete." below confirms.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;testproject-agent    | 2020-11-28 14:42:04.491 [INFO ] i.t.a.Program                            *** AGENT - START ***
testproject-agent    | 2020-11-28 14:42:04.742 [INFO ] i.t.a.a                                  TestProject Agent 0.65.10 (64dbf6a7cad45d6e980553444ef4ff002008b3f6) on Linux
testproject-agent    | 2020-11-28 14:42:04.777 [INFO ] i.t.a.a                                  Running as user /home/agent of type Unknown with non-Administrator privileges
testproject-agent    | 2020-11-28 14:42:04.780 [INFO ] i.t.a.a                                  Running inside docker 6a805a004559
testproject-agent    | 2020-11-28 14:42:04.790 [INFO ] i.t.a.a                                  No X11 available - headless mode.
testproject-agent    | 2020-11-28 14:42:04.800 [INFO ] i.t.a.s.IdentityManager                  No identity file was found - Agent is not registered
testproject-agent    | 2020-11-28 14:42:04.804 [INFO ] i.t.a.a                                  Working folder: /opt/testproject/agent
testproject-agent    | 2020-11-28 14:42:04.807 [INFO ] i.t.a.a                                  Data folder: /var/testproject/agent
testproject-agent    | 2020-11-28 14:42:04.816 [INFO ] i.t.a.a                                  Agent IP addresses: 1.1.1.1
testproject-agent    | 2020-11-28 14:42:04.823 [INFO ] i.t.a.m.H                                Checking connectivity with https://testproject.io:443 (Using proxy: No)
testproject-agent    | 2020-11-28 14:42:07.557 [INFO ] i.t.a.m.H                                Connection established successfully.
testproject-agent    | 2020-11-28 14:42:07.560 [INFO ] i.t.a.a                                  Direct connection (no proxy) to TP is possible.
testproject-agent    | 2020-11-28 14:42:08.023 [INFO ] i.t.a.a                                  Agent initialization is complete.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now we can go back to our TestProject account and verify that the agent shows up there. Navigate to the Agents tab; the new Docker agent appears on the page as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Feraftrodsirhs5a54jbl.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Feraftrodsirhs5a54jbl.JPG" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;h2&gt;
  
  
  🏃 Running the Test Jobs against the Docker Agent
&lt;/h2&gt;
&lt;/blockquote&gt;

&lt;p&gt;Under the Projects Tab, I have created a sample test project. You can follow the instruction in the &lt;a href="https://docs.testproject.io/using-the-smart-test-recorder/web-testing/creating-a-web-test-using-the-testproject-recorder" rel="noopener noreferrer"&gt;documentation&lt;/a&gt; to set up your test.&lt;/p&gt;

&lt;p&gt;Let's run the &lt;strong&gt;&lt;em&gt;My First Test&lt;/em&gt;&lt;/strong&gt; test case against our newly created Docker agent by clicking the play button as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Funhdekmchw3dwxbs0lkb.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Funhdekmchw3dwxbs0lkb.JPG" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you press the play button, a popup asks which agent to use; in our case we select the &lt;strong&gt;&lt;em&gt;Docker Agent&lt;/em&gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F7yjcz1jfbxu1jwdp9f8d.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F7yjcz1jfbxu1jwdp9f8d.JPG" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you press the Run button, the job runs on the Docker Agent, as the logs below show.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fnow0apzt3gj1qpfyft04.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fnow0apzt3gj1qpfyft04.JPG" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;testproject-agent    | 2020-11-28 17:35:13.207 [INFO ] i.t.a.m.c.p                              Successfully handled report for execution KXfcPG0yzUCFyP2Sf24CpQ
testproject-agent    | 2020-11-28 17:35:13.214 [INFO ] i.t.a.m.c.p                              Uploading attachments for execution KXfcPG0yzUCFyP2Sf24CpQ
testproject-agent    | 2020-11-28 17:35:13.217 [INFO ] i.t.a.m.c.p                              Successfully handled attachments for execution KXfcPG0yzUCFyP2Sf24CpQ
testproject-agent    | 2020-11-28 17:35:13.264 [INFO ] i.t.a.m.c.p                              Successfully finished execution KXfcPG0yzUCFyP2Sf24CpQ reporting.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F78m54a101c2l2lfo2rox.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F78m54a101c2l2lfo2rox.JPG" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;h2&gt;
  
  
  📑 Checking Test Reports
&lt;/h2&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;One of the premium features TestProject gives us for free is the report dashboard.&lt;/em&gt;&lt;/strong&gt; To explore it, navigate to the Reports tab, where we can see the reports of our last few runs on the Docker agent.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fu77sow0dutl8isp1yegp.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fu77sow0dutl8isp1yegp.JPG" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;h2&gt;
  
  
  📚 Summary
&lt;/h2&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;If you’re looking for a $0 cloud-based SaaS test automation framework designed for your agile team, TestProject is the right choice for you.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://i.giphy.com/media/klZBxHoFLN44M/giphy.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://i.giphy.com/media/klZBxHoFLN44M/giphy.gif" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;"It is manageable, quick and reliable !!!"&lt;/p&gt;

&lt;p&gt;It is built on top of industry-standard open-source tools (Selenium &amp;amp; Appium), supports all major operating systems, and ensures quality with speed through advanced built-in recording capabilities, addons, reports, and analytics dashboards. You can also develop coded tests using TestProject’s powerful SDK for Python, Java, or C#. TestProject fits this modern era well!&lt;/p&gt;

&lt;p&gt;Below are a few useful links about TestProject. &lt;strong&gt;&lt;em&gt;Check them out!&lt;/em&gt;&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://www.youtube.com/testproject" rel="noopener noreferrer"&gt;Official YouTube Channel&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;TestProject's open-source SDKs:&lt;/em&gt;&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://github.com/testproject-io/" rel="noopener noreferrer"&gt;Open-source Java SDK&lt;/a&gt; | &lt;a href="https://github.com/testproject-io/python-sdk" rel="noopener noreferrer"&gt;Python SDK&lt;/a&gt; | &lt;a href="https://github.com/testproject-io/addons" rel="noopener noreferrer"&gt;TestProject SDK for Addons&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://addons.testproject.io/" rel="noopener noreferrer"&gt;Addons library&lt;/a&gt; | &lt;a href="https://forum.testproject.io/" rel="noopener noreferrer"&gt;Forum&lt;/a&gt;&lt;/p&gt;

</description>
      <category>testing</category>
      <category>webdev</category>
      <category>devops</category>
    </item>
  </channel>
</rss>
