<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Anup Tilak</title>
    <description>The latest articles on DEV Community by Anup Tilak (@anuptilak).</description>
    <link>https://dev.to/anuptilak</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F450212%2Fd6f315f1-9d1c-464f-ad3c-3f26f0c7a29c.png</url>
      <title>DEV Community: Anup Tilak</title>
      <link>https://dev.to/anuptilak</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/anuptilak"/>
    <language>en</language>
    <item>
      <title>Understanding JavaScript Module Export and Import</title>
      <dc:creator>Anup Tilak</dc:creator>
      <pubDate>Mon, 07 Jul 2025 09:45:31 +0000</pubDate>
      <link>https://dev.to/anuptilak/understanding-javascript-module-export-and-import-3l4l</link>
      <guid>https://dev.to/anuptilak/understanding-javascript-module-export-and-import-3l4l</guid>
      <description>&lt;p&gt;In this article, we’ll explore various methods of including external JavaScript module files into your main or controller files. With ES6, we’ve got an excellent feature: the ability to export and import modules. This makes your code more organized, readable, and modular.&lt;/p&gt;

&lt;p&gt;Let’s dive into the topic.&lt;/p&gt;

&lt;h2&gt;
  
  
  Exporting Functions
&lt;/h2&gt;

&lt;p&gt;In a JS file, when you write multiple functions and want to use them in another file, you can use the keyword export. If you want to export more than one function, you can do it like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export { function1, function2, function3 }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Example:&lt;br&gt;
You have a calculation.js file that performs basic math operations. To export these functions:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const sub = (a, b) =&amp;gt; {
  return a - b;
}

const add = (a, b) =&amp;gt; {
  return a + b;
}

export { sub, add };
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, if you have only one function to export, there’s a special way to do that: export default.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const multiply = (a, b) =&amp;gt; {
  return a * b;
}

export default multiply;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Importing Functions
&lt;/h2&gt;

&lt;p&gt;Once your modules are neatly organized, it’s time to import them into your main file or controller.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Named Import (Multiple Functions)&lt;/strong&gt;&lt;br&gt;
To import the named exports from calculation.js, use the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { sub, add } from './modules/calculation.js';

console.log(sub(10, 20));
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note: Adjust the path based on your folder structure. Use relative paths (./, ../) instead of absolute ones.&lt;/p&gt;
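
&lt;p&gt;One more note: export and import only work in module scripts. As an assumed setup (not part of the original example), in the browser you'd load your main file with type="module", and in Node.js you'd use the .mjs extension or set "type": "module" in package.json:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;!-- index.html (hypothetical file name) --&amp;gt;
&amp;lt;script type="module" src="./main.js"&amp;gt;&amp;lt;/script&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
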

&lt;p&gt;&lt;strong&gt;Default Import&lt;/strong&gt;&lt;br&gt;
When you've exported a single function as default:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import multiply from './modules/calculation.js';

console.log(multiply(20, 3));
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Import with Namespace&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import * as calc from './modules/calculation.js';

console.log(calc.add(5, 10));
console.log(calc.sub(20, 5));
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This way, you refer to all your functions through the calc object.&lt;/p&gt;
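
&lt;p&gt;&lt;strong&gt;Combined Default and Named Import&lt;/strong&gt;&lt;br&gt;
You can also import a default export and named exports in a single statement. This sketch assumes a calculation.js that exports multiply as the default alongside the named sub and add (a slight variation on the earlier examples, which exported them separately):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import multiply, { sub, add } from './modules/calculation.js';

console.log(multiply(sub(10, 4), add(1, 2))); // 18
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
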

&lt;h2&gt;
  
  
  Combining Multiple Modules
&lt;/h2&gt;

&lt;p&gt;Let’s say you have 3–4 module files, and you don’t want to import each one individually in every file. The cleanest way is to combine all exports into a single file, then import from that intermediate file wherever needed.&lt;/p&gt;

&lt;p&gt;Combine.js&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export * from './modules/calc.js';
export * from './modules/names.js';
export { default as seasons } from './modules/seasons.js';
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then, in your controller or main JS file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import * as combine from './modules/Combine.js';

console.log(combine.multiply(22, 8));
console.log(combine.seasons.whichSeason());
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this setup, Combine.js acts as a bridge between your modules and your main file.&lt;/p&gt;
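
&lt;p&gt;For reference, the module files behind Combine.js might look like this. The contents below are assumed purely for illustration; your actual calc.js, names.js, and seasons.js will differ:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// modules/calc.js (assumed contents)
export const multiply = (a, b) =&amp;gt; a * b;

// modules/names.js (assumed contents)
export const names = ['Asha', 'Ravi'];

// modules/seasons.js (assumed contents)
export default {
  whichSeason() {
    return 'summer';
  }
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
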

&lt;h2&gt;
  
  
  Final Thoughts
&lt;/h2&gt;

&lt;p&gt;That’s it for today! I’ll be back soon with another article. I hope this guide helped clarify how export and import work in JavaScript. Feel free to share your thoughts or questions in the comments.&lt;/p&gt;

&lt;p&gt;Thanks for reading — cheers! 🙌&lt;/p&gt;

</description>
      <category>programming</category>
      <category>javascript</category>
      <category>es6</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Streamlining PM2 Startup for Node.js Applications: A Comprehensive Guide</title>
      <dc:creator>Anup Tilak</dc:creator>
      <pubDate>Tue, 27 Feb 2024 08:02:22 +0000</pubDate>
      <link>https://dev.to/anuptilak/streamlining-pm2-startup-for-nodejs-applications-a-comprehensive-guide-8o</link>
      <guid>https://dev.to/anuptilak/streamlining-pm2-startup-for-nodejs-applications-a-comprehensive-guide-8o</guid>
      <description>&lt;p&gt;Ensuring the continuous operation of Node.js applications is essential for maintaining their availability and reliability. PM2, a leading process manager for Node.js applications, offers a streamlined solution for automating the startup process, enabling applications to persist across system reboots and failures. In this guide, we'll explore the steps to set up PM2 for automatic startup, covering various init systems and customisation options.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Supported Init Systems:&lt;/strong&gt;&lt;br&gt;
PM2 boasts compatibility with a diverse range of init systems:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;systemd&lt;/li&gt;
&lt;li&gt;upstart&lt;/li&gt;
&lt;li&gt;launchd&lt;/li&gt;
&lt;li&gt;openrc&lt;/li&gt;
&lt;li&gt;rcd&lt;/li&gt;
&lt;li&gt;systemv&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Supported OS:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;systemd: Ubuntu &amp;gt;= 16, CentOS &amp;gt;= 7, Arch, Debian &amp;gt;= 7&lt;/li&gt;
&lt;li&gt;upstart: Ubuntu &amp;lt;= 14&lt;/li&gt;
&lt;li&gt;launchd: Darwin, MacOSx&lt;/li&gt;
&lt;li&gt;openrc: Gentoo Linux, Arch Linux&lt;/li&gt;
&lt;li&gt;rcd: FreeBSD&lt;/li&gt;
&lt;li&gt;systemv: Centos 6, Amazon Linux&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Generating a Startup Script:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pm2 startup
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After executing this command, you'll see a bunch of lines printed on the terminal; look for the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[PM2] You have to run this command as root. Execute the following command:
      sudo su -c "env PATH=$PATH:/home/unitech/.nvm/versions/node/v14.3/bin pm2 startup &amp;lt;distribution&amp;gt; -u &amp;lt;user&amp;gt; --hp &amp;lt;home-path&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now copy the generated command and run it with sudo. The "&lt;strong&gt;distribution, user, and home-path&lt;/strong&gt;" placeholders get automatically replaced with the respective values. If you wish to run it as a different user, you can change those values. The following is an example command.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo env PATH=$PATH:/home/ec2-user/.nvm/versions/node/v16.19.1/bin /home/ec2-user/.nvm/versions/node/v16.19.1/lib/node_modules/pm2/bin/pm2 startup systemd -u ec2-user --hp /home/ec2-user
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;(Don't copy this command from the code box above; copy the one generated in your terminal, since the command is machine-specific.)&lt;/p&gt;

&lt;p&gt;With the above command, PM2 will start automatically after a restart, including on new servers you generate from an AMI. &lt;/p&gt;

&lt;p&gt;Now, if your Node.js application isn't already running, start it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PM2 start path to your start up file --name app name

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It'll look something like this: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhaht6gwgl6buvw1srd9u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhaht6gwgl6buvw1srd9u.png" alt="PM2 list" width="800" height="51"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Saving the App List:&lt;/strong&gt;&lt;br&gt;
After initiating the desired applications, safeguard them against system restarts:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pm2 save
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Manual Process Resurrection:&lt;/strong&gt;&lt;br&gt;
Manually restart previously saved processes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pm2 resurrect
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Disabling Startup System:&lt;/strong&gt;&lt;br&gt;
Temporarily disable or remove the current startup configuration:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pm2 unstartup
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Updating Startup Script:&lt;/strong&gt;&lt;br&gt;
Incorporate changes to the Node.js version:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pm2 unstartup
pm2 startup
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Checking the systemd Installation:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;systemctl list-units
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will list all systemd units; look for a PM2 entry (a unit named after your user, e.g. pm2-ec2-user) in the list. If you find it, you've successfully configured PM2 to start automatically after a restart. &lt;/p&gt;
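
&lt;p&gt;To narrow the output down, you can filter the unit list. The commands below are a sketch: PM2 names its unit following the pm2-&amp;lt;user&amp;gt; pattern, and ec2-user is just an example username from earlier:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;systemctl list-units | grep pm2

# or inspect the unit directly (ec2-user is an example username)
systemctl status pm2-ec2-user
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
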

&lt;p&gt;By following these streamlined steps and utilizing the provided commands, developers can automate PM2 startup processes efficiently, ensuring the seamless and persistent operation of their Node.js applications across diverse environments.&lt;/p&gt;

</description>
      <category>node</category>
      <category>pm2</category>
      <category>autorestart</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Leadership Lessons from the Heart: Embracing Humility, Empathy, and Communication</title>
      <dc:creator>Anup Tilak</dc:creator>
      <pubDate>Thu, 17 Aug 2023 12:28:15 +0000</pubDate>
      <link>https://dev.to/anuptilak/leadership-lessons-from-the-heart-embracing-humility-empathy-and-communication-2nnl</link>
      <guid>https://dev.to/anuptilak/leadership-lessons-from-the-heart-embracing-humility-empathy-and-communication-2nnl</guid>
      <description>&lt;p&gt;In the journey of leadership, some are born with innate abilities, while others strive to develop their skills through hard work and dedication. For me, drawing inspiration from the armed forces has been a lifelong passion, even though I could not join them. However, this has not stopped me from adopting their profound leadership principles. Throughout my experience, I have come to realise that humility and empathy are the cornerstone qualities for any leader. Instead of asserting authority, true leadership lies in fostering genuine connections and earning the trust and loyalty of the team. With this perspective, I have been able to lead my team towards collective success.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Humility and Empathy - The Essence of Genuine Leadership:&lt;/strong&gt;&lt;br&gt;
I believe that leaders who embrace humility and empathy create an environment where team members feel valued and motivated to perform their best. As I strive to lead with compassion and understanding, my team knows that I respect them not just as colleagues but as individuals. This sense of camaraderie strengthens our bond and paves the way for success in both personal and professional realms.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Emulating the Armed Forces - Leaving a Lasting Impact on My Leadership Style:&lt;/strong&gt;&lt;br&gt;
While I could not fulfil my childhood dream of joining the armed forces, my fascination with their leadership principles has significantly influenced my approach as a leader. One vital lesson I have imbibed from them is taking responsibility for both successes and failures. I passionately believe in the mantra, "Success is ours, and failure is mine," instilling a sense of accountability within the team and promoting an environment of openness and learning from mistakes. Interestingly other day I was scrolling through YouTube shorts, and I came across former President Dr APJ Abdul Kalam story. It goes like this, former ISRO Chairman Satish Dhawan took responsibility when a SLV-3 mission satellite headed by late President Dr APJ Abdul Kalam fell into Bay of Bengal in 1979 however next year when mission was successful, Director told then Dr APJ sir to conduct the press conference, giving full credit to former President Dr APJ sir and his team keeping himself out of limelight.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Knowing My Team Personally - Nurturing Compassionate Leadership:&lt;/strong&gt;&lt;br&gt;
I value the power of personal connections with my team members. By being genuinely interested in their lives and understanding their challenges, including medical issues affecting their families. As a team when you lookout for your colleague in his challenging time enabling him to take care of his situation makes your team bond grow stronger. Empathising with their situations fosters trust and a sense of belonging, creating a unified and motivated team. A couple of years back I had a privilege to work with one of the best team. This was my first time lead the team. In the beginning I was clueless what I am doing and what needs to be done, slowly I got hang of the job and I started having meetings with my team as a whole and individually as well. Knowing them personally and professionally, aligning their goals with company goals. As a team identifying areas of improvements and giving them space and platform to highlight their abilities in front of client in client meetings rather than I am doing everything.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Embracing Vulnerability - Fostering Trust and Growth:&lt;/strong&gt;&lt;br&gt;
Sharing my vulnerabilities with the team has proven to be a catalyst for building trust. I encourage my colleagues to do the same, promoting an atmosphere of authenticity and mutual support. This approach allows us to complement each other's strengths and collectively tackle weaknesses, propelling us towards success with resilience and unity.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Communication - The Backbone of Team Growth:&lt;/strong&gt;&lt;br&gt;
In today's virtual work environment, effective communication is paramount. Regularly connecting with each team member individually and as a group helps me understand their perspectives, concerns, and aspirations. Moreover, it provides opportunities to groom potential leaders within the team, ensuring that growth is a shared journey, benefiting the entire team.&lt;/p&gt;

&lt;p&gt;Through my leadership journey, I have learned that true leadership is not about wielding authority but about embracing humility, empathy, and effective communication. By drawing inspiration from the armed forces and cultivating a compassionate leadership style, I have fostered an environment where my team members feel valued, motivated, and ready to contribute their best. Together, we celebrate success as a team and face failure as a collective learning experience, reinforcing our bonds and empowering each other to achieve greatness. As I continue to lead with my heart, I am confident that our team will thrive and achieve even greater heights of success.&lt;/p&gt;

&lt;p&gt;This is my smallest attempt to share my journey and learnings with you all. Hopefully, this helps upcoming leaders and/or who are already at managerial positions. Keep Learning, Keep Growing. Cheers.&lt;/p&gt;

</description>
      <category>leadership</category>
      <category>learning</category>
      <category>motivation</category>
    </item>
    <item>
      <title>Publish PM2 logs to AWS CloudWatch</title>
      <dc:creator>Anup Tilak</dc:creator>
      <pubDate>Tue, 01 Aug 2023 09:46:06 +0000</pubDate>
      <link>https://dev.to/anuptilak/publish-pm2-logs-to-aws-cloudwatch-1mop</link>
      <guid>https://dev.to/anuptilak/publish-pm2-logs-to-aws-cloudwatch-1mop</guid>
      <description>&lt;p&gt;Pushing your PM2 logs from an EC2 machine to AWS CloudWatch requires a few crucial steps. In this article, we'll go through each step in detail, but before we begin, it's essential to understand that EC2 logs will not automatically be pushed to CloudWatch. To facilitate this process, a CloudWatch agent needs to be installed on your EC2 instance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--oqt8mvGC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1690878868470/e3ea8bb4-9009-41aa-8d56-6b1a0f232a19.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--oqt8mvGC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1690878868470/e3ea8bb4-9009-41aa-8d56-6b1a0f232a19.png" alt="" width="525" height="481"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To begin with, ensure that the correct IAM permissions have been set up. Here's a template for the permissions required:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents",
        "logs:DescribeLogStreams"
      ],
      "Resource": ["*"]
    }
  ]
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Alternatively, you can use "logs:*" for simplicity and add it to the existing group.&lt;/p&gt;

&lt;p&gt;To install the CloudWatch log agent, use the following command:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;sudo yum install -y awslogs&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Note that this command is specific to Amazon Linux, and you may need to adjust it based on your Linux distribution.&lt;/p&gt;

&lt;p&gt;Next, update your region in &lt;code&gt;/etc/awslogs/awscli.conf&lt;/code&gt;. By default, it points to &lt;code&gt;us-east-1&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[plugins]
cwlogs = cwlogs
[default]
region = ap-southeast-1

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To specify the logs to be tracked, edit &lt;code&gt;/etc/awslogs/awslogs.conf&lt;/code&gt;. By default, this file tracks logs from &lt;code&gt;/var/log/messages&lt;/code&gt;. To get logs from your specific files, change the configuration. For example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[/var/log/Your-Chosen-Name/error.log]
datetime_format = %b %d %H:%M:%S
file = /var/log/Your-Chosen-Name/error.log
buffer_duration = 5000
log_stream_name = {instance_id}
initial_position = start_of_file
log_group_name = Your-LogGroup-Name

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, &lt;code&gt;log_stream_name = {instance_id}&lt;/code&gt; signifies that the log stream will be named after the instance id of the EC2 instance sending the logs. The &lt;code&gt;initial_position = start_of_file&lt;/code&gt; tells the agent to start reading from the beginning of the file. Lastly, &lt;code&gt;log_group_name = Your-LogGroup-Name&lt;/code&gt; refers to the name of the log group on CloudWatch. If it doesn't already exist, CloudWatch will create it for you.&lt;/p&gt;

&lt;p&gt;To send your PM2 logs to this new location, you'll need to modify the &lt;code&gt;ecosystem.config.js&lt;/code&gt; file like so:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module.exports = {
  apps: [{
    name: "Your-App-Name",
    script: "Start-Up-File-Name",
    error_file: "/var/log/Your-Chosen-Name/error.log",
    out_file: "/var/log/Your-Chosen-Name/out.log",
    watch: true,
    env: {
      NODE_ENV: 'Your-ENV'
    }
  }]
};

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Save this file and restart PM2. With these steps, your logs are now being saved in the new directory.&lt;/p&gt;
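
&lt;p&gt;One assumption worth calling out: PM2 needs write access to the log directory before it can create those files. A minimal sketch, using the same placeholder directory name as above (adjust the user and paths to your setup):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# create the log directory and give your user write access (placeholder names)
sudo mkdir -p /var/log/Your-Chosen-Name
sudo chown "$USER": /var/log/Your-Chosen-Name

# apply the updated ecosystem file
pm2 reload ecosystem.config.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
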

&lt;p&gt;Finally, start the CloudWatch agent using &lt;code&gt;sudo service awslogs start&lt;/code&gt; (or &lt;code&gt;sudo systemctl start awslogsd&lt;/code&gt; if you're using Amazon Linux). To ensure the agent starts upon system reboot, run &lt;code&gt;sudo systemctl enable awslogsd.service&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Now, login to the AWS console, navigate to CloudWatch, and check the 'logs' tab. Here, you should be able to find your log group and see your logs streaming from your EC2 instance. For more detailed instructions, check out the AWS documentation &lt;a href="https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/QuickStartEC2Instance.html"&gt;here&lt;/a&gt;.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>AWS EIC - Amazon EC2 Instance Connect</title>
      <dc:creator>Anup Tilak</dc:creator>
      <pubDate>Fri, 07 Jul 2023 06:28:04 +0000</pubDate>
      <link>https://dev.to/anuptilak/aws-eic-amazon-ec2-instance-connect-3l5</link>
      <guid>https://dev.to/anuptilak/aws-eic-amazon-ec2-instance-connect-3l5</guid>
      <description>&lt;p&gt;Amazon Web Services (AWS) has unveiled a fresh feature known as the Amazon EC2 Instance Connect (EIC) Endpoint. This innovative feature permits users to establish a secure connection to their instances and various resources within the Amazon Virtual Private Cloud (Amazon VPC) directly from the internet.&lt;/p&gt;

&lt;p&gt;Previously, users had to navigate through a somewhat complex process to establish connections. They had to connect to a bastion host with a public IP address set up by their administrator over an Internet Gateway (IGW) inside their VPC. Further, they had to use port forwarding to reach their intended destination. The advent of the EIC Endpoint erases the need for an IGW within their VPC, a public IP address on their resource, a bastion host, or any agent to establish connections to their resources.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0fD1vP4p--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1688710633825/3404314a-fbb6-4eeb-8509-f41c4fdf6d35.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0fD1vP4p--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1688710633825/3404314a-fbb6-4eeb-8509-f41c4fdf6d35.png" alt="" width="800" height="651"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The EIC Endpoint marries identity-based with network-based access controls, aiming to meet an organization's security requirements. It promises to provide isolation, control, and comprehensive logging. Furthermore, the EIC Endpoint eases the load on the organization's administrator by removing the operational tasks associated with maintaining and updating bastion hosts for connectivity. It is compatible with the AWS Management Console and AWS Command Line Interface (AWS CLI), while still offering users the flexibility to use tools like PuTTY and OpenSSH.&lt;/p&gt;

&lt;p&gt;The EIC Endpoint operates as an identity-aware TCP proxy and provides two operating modes. The first mode allows secure WebSocket tunneling from the workstation to the endpoint using AWS Identity and Access Management (IAM) credentials. This mode enables users to connect to resources in the usual way. Meanwhile, the second mode comes into play when not utilizing the AWS CLI. The Console guarantees secure access to VPC resources by evaluating authentication and authorization before allowing traffic into the VPC.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Getting Started with Amazon Web Services Virtual Private Cloud: A Beginner's Guide</title>
      <dc:creator>Anup Tilak</dc:creator>
      <pubDate>Mon, 12 Jun 2023 10:07:32 +0000</pubDate>
      <link>https://dev.to/anuptilak/getting-started-with-amazon-web-services-virtual-private-cloud-a-beginners-guide-a32</link>
      <guid>https://dev.to/anuptilak/getting-started-with-amazon-web-services-virtual-private-cloud-a-beginners-guide-a32</guid>
      <description>&lt;p&gt;Are you just starting on your Amazon Web Services (AWS) cloud computing journey? The Virtual Private Cloud (VPC), one of the many services that AWS provides, is one that you'll undoubtedly run into. In this article, we'll gradually dive into understanding AWS VPC and why it's so important for those just getting started with cloud computing.&lt;/p&gt;

&lt;p&gt;A VPC is a virtual network within the AWS ecosystem where you can launch AWS resources like databases and application servers. It's like having your own slice of the AWS cloud, logically isolated from everyone else's, giving you complete control over your virtual networking environment. For example, it's like having your own room in a hostel, without a roommate, where you control the entire setting: though you are part of the hostel, you still have your own space.&lt;/p&gt;

&lt;p&gt;Creating a VPC is like setting up your network in a traditional on-premises data centre but with the added advantages of scalability, security, and integration with powerful AWS services. Your VPC can span multiple AWS availability zones, enhancing the redundancy and reliability of your applications.&lt;/p&gt;

&lt;p&gt;The major benefit of a VPC is the flexibility it provides. If you are thinking of launching a public web application, you can set up a public subnet, or if you want a secure backend, you can achieve that too with a private subnet. Choose your network the way your application needs it and set it up. All these customisations are available at your fingertips through the web console.&lt;/p&gt;

&lt;p&gt;Another important aspect is security. In an AWS VPC, you control your network's access points, both inbound and outbound. You can use security groups and network access control lists (ACLs) to provide stringent security measures. Plus, you can leverage VPC Flow Logs for network traffic monitoring and troubleshooting.&lt;/p&gt;

&lt;p&gt;Getting started with AWS VPC might seem very difficult at the start, but remember, you'll get there. Don't feel pressured to understand everything at once. Begin by familiarising yourself with its basic concepts, like subnets, route tables, and internet gateways. From there, experiment, learn, and grow. AWS provides plenty of documentation and resources to guide you along the way. AWS FAQs is a great place to start. Most of your questions will get answered there.&lt;/p&gt;

&lt;p&gt;In the end, I would like to conclude with this: AWS VPC is a powerful tool in the AWS suite that gives you control, security, and flexibility. Embracing it early on sets a solid foundation for your cloud computing journey. With patience, curiosity, and a bit of practice, you'll soon be navigating your AWS VPC with ease. Happy cloud computing!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Working With AWS IAM</title>
      <dc:creator>Anup Tilak</dc:creator>
      <pubDate>Sat, 08 Jan 2022 10:23:44 +0000</pubDate>
      <link>https://dev.to/anuptilak/working-with-aws-iam-3na2</link>
      <guid>https://dev.to/anuptilak/working-with-aws-iam-3na2</guid>
      <description>&lt;p&gt;Amazon Web Services started in 2002 with a simple SQS service and now it has expanded in 2000+ services over the years. Every year AWS holds one event called AWS re: invent where all the AWS communities from the world gather together and AWS announces new services coming into play with the upcoming year. By default, all services are always available in the us-east-1 region and gradually it expands its footprint in other regions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;IAM&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this article, we'll be talking about AWS IAM, i.e. the AWS Identity and Access Management service, which you can use to securely control individual and group access to your AWS resources. You can create and manage user identities ("IAM users") and grant permissions for those IAM users to access your resources.&lt;/p&gt;

&lt;p&gt;To access IAM, you need access to at least one AWS service that is integrated with IAM. With this, you can manage users, groups, and permissions via the AWS CLI or the AWS Console.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Users&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The users created with IAM are called IAM users. You have one root user, who is the superuser of your AWS account, and then you have IAM users. Typically in an organization, you create an Administrator user who takes care of creating other users and groups and their related policies. Groups are created with a set of users who have common permissions. You can create more than one group, and one user can be part of more than one group, but one group can't be part of another group.&lt;/p&gt;

&lt;p&gt;Every IAM user created within AWS with programmatic access has an Access Key and a Secret Key. These keys can be used to access all AWS services through the SDK, the REST API, or the AWS CLI. Please make sure that you never share those keys with anyone or use them directly in any code.&lt;/p&gt;
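
&lt;p&gt;Instead of hardcoding keys, keep them outside your source. A sketch of the two usual options (the key values shown are placeholders):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Option 1: interactive setup, stores keys in ~/.aws/credentials
aws configure

# Option 2: environment variables (placeholder values, never commit real ones)
export AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY_ID
export AWS_SECRET_ACCESS_KEY=YOUR_SECRET_ACCESS_KEY
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
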

&lt;p&gt;&lt;strong&gt;Policies&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;AWS gives you a set of standard policies with which users can access AWS services. You can use the AWS web console to assign policies, or you can write them in JSON format; both options are available through the AWS console. Below is an example of such a JSON policy for an EC2 resource.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ec2:AttachVolume",
                "ec2:DetachVolume"
            ],
            "Resource": [
                "arn:aws:ec2:*:*:volume/*",
                "arn:aws:ec2:*:*:instance/*"
            ],
            "Condition": {
                "ArnEquals": {"ec2:SourceInstanceARN": "arn:aws:ec2:*:*:instance/instance-id"}
            }
        }
    ]
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Security&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;IAM is also used to formulate security policies and enforce them on users. These policies mainly cover the minimum password length, the strength and character combination of the password, the password rotation period, and MFA. MFA is multi-factor authentication, which can be set up with Google Authenticator or Authy. Physical MFA devices also exist, but their usage is very rare.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Security Tools&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;IAM provides two security tools to run reports on your AWS account.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;IAM Credentials Report - This report provides you an account-level list of all users &amp;amp; the status of their various credentials.&lt;/li&gt;
&lt;li&gt;IAM Access Advisor - This report gives you an overview of which permissions each user received and when each permission was last accessed. It usually takes about 4 hours to populate this report, and the data is retained for 365 days. &lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Best Practices of IAM Policies&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Don't use Root User&lt;/li&gt;
&lt;li&gt;One Physical User = One AWS user&lt;/li&gt;
&lt;li&gt;Assign users to groups and assign permission to groups&lt;/li&gt;
&lt;li&gt;Create a strong password policy&lt;/li&gt;
&lt;li&gt;Enforce the use of MFA&lt;/li&gt;
&lt;li&gt;Create &amp;amp; use roles for giving permissions to AWS services&lt;/li&gt;
&lt;li&gt;Audit your AWS account permissions with the IAM Credentials Report and Access Advisor&lt;/li&gt;
&lt;li&gt;Never Ever Share your Access key and Secret key&lt;/li&gt;
&lt;li&gt;Never use them directly in your code. &lt;/li&gt;
&lt;li&gt;ENV files containing keys should never be pushed to any Git repo. &lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Summary&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;AWS IAM is the service that helps you govern your account and your users. With IAM, users, groups, and policies are created to manage AWS account access effectively. AWS IAM also helps your account comply with standards such as PCI DSS. In all cases, you have to follow IAM best practices to ensure your account's safety.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://aws.amazon.com/iam/faqs/"&gt;IAM FAQs&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
