<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Sudhanshu Mukherjee</title>
    <description>The latest articles on DEV Community by Sudhanshu Mukherjee (@sudhanshumukherjeexx).</description>
    <link>https://dev.to/sudhanshumukherjeexx</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F804322%2Fe30be9de-6bc2-4642-9a2d-88cec0acf12e.jpeg</url>
      <title>DEV Community: Sudhanshu Mukherjee</title>
      <link>https://dev.to/sudhanshumukherjeexx</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/sudhanshumukherjeexx"/>
    <language>en</language>
    <item>
      <title>How to use AWS Rekognition for Object Detection</title>
      <dc:creator>Sudhanshu Mukherjee</dc:creator>
      <pubDate>Wed, 12 Jul 2023 01:03:43 +0000</pubDate>
      <link>https://dev.to/sudhanshumukherjeexx/how-to-use-aws-rekognition-for-object-detection-517j</link>
      <guid>https://dev.to/sudhanshumukherjeexx/how-to-use-aws-rekognition-for-object-detection-517j</guid>
      <description>&lt;p&gt;Amazon Web Services (AWS) is the leading cloud computing platform offered by Amazon.com. It provides a comprehensive suite of cloud-based services that enable individuals and organizations to build, deploy, and manage applications and infrastructure with ease and scalability. From computing power to storage, databases to machine learning, AWS offers a vast array of services that cater to a wide range of requirements. In this article, we will explore some of the key services provided by Amazon AWS.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Elastic Compute Cloud (EC2): EC2 is a highly scalable virtual server service that allows users to rent virtual machines in the cloud. It provides a flexible and secure computing environment, allowing users to configure instances with various operating systems, storage options, and networking capabilities.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Simple Storage Service (S3): S3 is a highly durable and scalable object storage service that enables users to store and retrieve vast amounts of data. It offers high availability, data encryption, and easy integration with other AWS services, making it an ideal choice for storing backups, static content for websites, and big data analytics.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Relational Database Service (RDS): RDS simplifies the setup, operation, and scaling of relational databases in the cloud. It supports popular database engines such as MySQL, PostgreSQL, Oracle, and Microsoft SQL Server. RDS automates tasks like backups, software patching, and database scaling, allowing users to focus on their applications rather than database management.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Lambda: AWS Lambda is a serverless computing service that enables users to run code without provisioning or managing servers. It allows developers to build applications using functions that are triggered by events, providing a highly scalable and cost-effective solution for running code in response to various events.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Identity and Access Management (IAM): IAM is a service that helps manage user access and permissions for AWS resources. It allows users to create and manage user accounts, roles, and groups, providing fine-grained control over who can access specific resources and actions within the AWS environment.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Elastic Beanstalk: Elastic Beanstalk is a fully managed service for deploying and scaling web applications. It provides an easy-to-use platform that abstracts the underlying infrastructure, allowing developers to focus on writing code. Elastic Beanstalk supports various programming languages and frameworks, making it a versatile choice for deploying web applications.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Simple Queue Service (SQS): SQS is a fully managed message queuing service that enables decoupling of components in distributed systems. It allows applications to send, store, and receive messages between different software components, helping to build scalable and fault-tolerant systems.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Simple Notification Service (SNS): SNS is a highly flexible and scalable messaging service that enables the pub/sub (publish/subscribe) messaging pattern. It allows applications to send notifications to multiple recipients or subscribers via email, SMS, mobile push, or other custom endpoints.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Machine Learning Services: AWS provides a suite of services for machine learning and artificial intelligence (AI) applications. This includes Amazon SageMaker, which simplifies the process of building, training, and deploying machine learning models, as well as services like Amazon Rekognition for image and video analysis, Amazon Comprehend for natural language processing, and Amazon Polly for text-to-speech conversion.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Internet of Things (IoT): AWS offers a comprehensive set of services for building and managing IoT applications. This includes AWS IoT Core, a managed cloud platform for securely connecting and managing devices, as well as services like AWS IoT Analytics for data analysis, AWS IoT Greengrass for offline and edge computing, and AWS IoT Device Management for device provisioning and management.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;These are just a few examples of the wide range of services provided by Amazon AWS. With its vast portfolio of offerings, AWS caters to the needs of startups, enterprises, and developers alike, empowering them to innovate and build scalable, reliable, and cost-effective solutions in the cloud. Whether it's computing power, storage, databases, machine learning, or IoT, AWS provides the building blocks to support a wide variety of applications and workloads in the cloud.&lt;/p&gt;
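&lt;p&gt;Since this post's title promises object detection with Amazon Rekognition, here is a minimal sketch of the &lt;em&gt;detect_labels&lt;/em&gt; call via boto3. The bucket and image names are placeholders, and the call assumes AWS credentials are already configured.&lt;/p&gt;

```python
# Hypothetical sketch: detect objects (labels) in an S3-hosted image with
# Amazon Rekognition via boto3. Bucket and key names below are placeholders.

def summarize_labels(response, min_confidence=80.0):
    """Extract (name, confidence) pairs from a detect_labels response."""
    return [
        (label["Name"], round(label["Confidence"], 1))
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]

def detect_objects(bucket, key):
    import boto3  # requires AWS credentials to be configured
    client = boto3.client("rekognition")
    response = client.detect_labels(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        MaxLabels=10,
        MinConfidence=80.0,
    )
    return summarize_labels(response)

if __name__ == "__main__":
    # Placeholder bucket/key; replace with your own.
    print(detect_objects("my-bucket", "photos/street.jpg"))
```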

</description>
      <category>computervision</category>
      <category>machinelearning</category>
      <category>datascience</category>
    </item>
    <item>
      <title>Amazon AWS and Its Services</title>
      <dc:creator>Sudhanshu Mukherjee</dc:creator>
      <pubDate>Wed, 12 Jul 2023 01:00:43 +0000</pubDate>
      <link>https://dev.to/sudhanshumukherjeexx/amazon-aws-and-its-services-28j8</link>
      <guid>https://dev.to/sudhanshumukherjeexx/amazon-aws-and-its-services-28j8</guid>
      <description>&lt;p&gt;Amazon Web Services (AWS) is the leading cloud computing platform offered by Amazon.com. It provides a comprehensive suite of cloud-based services that enable individuals and organizations to build, deploy, and manage applications and infrastructure with ease and scalability. From computing power to storage, databases to machine learning, AWS offers a vast array of services that cater to a wide range of requirements. In this article, we will explore some of the key services provided by Amazon AWS.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Elastic Compute Cloud (EC2): EC2 is a highly scalable virtual server service that allows users to rent virtual machines in the cloud. It provides a flexible and secure computing environment, allowing users to configure instances with various operating systems, storage options, and networking capabilities.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Simple Storage Service (S3): S3 is a highly durable and scalable object storage service that enables users to store and retrieve vast amounts of data. It offers high availability, data encryption, and easy integration with other AWS services, making it an ideal choice for storing backups, static content for websites, and big data analytics.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Relational Database Service (RDS): RDS simplifies the setup, operation, and scaling of relational databases in the cloud. It supports popular database engines such as MySQL, PostgreSQL, Oracle, and Microsoft SQL Server. RDS automates tasks like backups, software patching, and database scaling, allowing users to focus on their applications rather than database management.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Lambda: AWS Lambda is a serverless computing service that enables users to run code without provisioning or managing servers. It allows developers to build applications using functions that are triggered by events, providing a highly scalable and cost-effective solution for running code in response to various events.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Identity and Access Management (IAM): IAM is a service that helps manage user access and permissions for AWS resources. It allows users to create and manage user accounts, roles, and groups, providing fine-grained control over who can access specific resources and actions within the AWS environment.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Elastic Beanstalk: Elastic Beanstalk is a fully managed service for deploying and scaling web applications. It provides an easy-to-use platform that abstracts the underlying infrastructure, allowing developers to focus on writing code. Elastic Beanstalk supports various programming languages and frameworks, making it a versatile choice for deploying web applications.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Simple Queue Service (SQS): SQS is a fully managed message queuing service that enables decoupling of components in distributed systems. It allows applications to send, store, and receive messages between different software components, helping to build scalable and fault-tolerant systems.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Simple Notification Service (SNS): SNS is a highly flexible and scalable messaging service that enables the pub/sub (publish/subscribe) messaging pattern. It allows applications to send notifications to multiple recipients or subscribers via email, SMS, mobile push, or other custom endpoints.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Machine Learning Services: AWS provides a suite of services for machine learning and artificial intelligence (AI) applications. This includes Amazon SageMaker, which simplifies the process of building, training, and deploying machine learning models, as well as services like Amazon Rekognition for image and video analysis, Amazon Comprehend for natural language processing, and Amazon Polly for text-to-speech conversion.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Internet of Things (IoT): AWS offers a comprehensive set of services for building and managing IoT applications. This includes AWS IoT Core, a managed cloud platform for securely connecting and managing devices, as well as services like AWS IoT Analytics for data analysis, AWS IoT Greengrass for offline and edge computing, and AWS IoT Device Management for device provisioning and management.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
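&lt;p&gt;As a small illustration of how two of the services above compose, here is a hedged sketch that uploads a file to S3 and publishes a notification message to SQS with boto3. The bucket name, queue URL, and file path are placeholders, and AWS credentials are assumed to be configured.&lt;/p&gt;

```python
# Hypothetical sketch: store a file in S3, then notify consumers via SQS.
# Bucket, key, path, and queue URL are placeholders, not real resources.
import json

def build_message(bucket, key):
    """Build the JSON payload announcing that an object was uploaded."""
    return json.dumps({"event": "object_uploaded", "bucket": bucket, "key": key})

def upload_and_notify(path, bucket, key, queue_url):
    import boto3  # requires AWS credentials to be configured
    boto3.client("s3").upload_file(path, bucket, key)   # store the file in S3
    boto3.client("sqs").send_message(                   # notify downstream consumers
        QueueUrl=queue_url,
        MessageBody=build_message(bucket, key),
    )
```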

&lt;p&gt;These are just a few examples of the wide range of services provided by Amazon AWS. With its vast portfolio of offerings, AWS caters to the needs of startups, enterprises, and developers alike, empowering them to innovate and build scalable, reliable, and cost-effective solutions in the cloud. Whether it's computing power, storage, databases, machine learning, or IoT, AWS provides the building blocks to support a wide variety of applications and workloads in the cloud.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>datascience</category>
      <category>machinelearning</category>
      <category>community</category>
    </item>
    <item>
      <title>Web Scraping Multiple Pages using Python &amp; BeautifulSoup</title>
      <dc:creator>Sudhanshu Mukherjee</dc:creator>
      <pubDate>Thu, 27 Jan 2022 06:26:59 +0000</pubDate>
      <link>https://dev.to/sudhanshumukherjeexx/web-scraping-multiple-pages-using-python-beautifulsoup-26dg</link>
      <guid>https://dev.to/sudhanshumukherjeexx/web-scraping-multiple-pages-using-python-beautifulsoup-26dg</guid>
      <description>&lt;h2&gt;
  
  
  Web Scraping
&lt;/h2&gt;

&lt;p&gt;Web scraping is used to examine websites for unstructured data and store it in a structured form for later use. It lets us iterate over several web pages, extract the required information, and save it in a format that suits the user.&lt;/p&gt;

&lt;p&gt;In this project, we will learn &lt;strong&gt;how to use Python and Beautiful Soup to scrape the names and prices of Mi mobile phones from the Flipkart e-commerce website&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/sudhanshumukherjeexx/WebScrapingFlipkart"&gt;Click here for the code&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;Inspecting the Element&lt;/h3&gt;

&lt;p&gt;The objective of this project is to scrape Flipkart's website and extract the names and prices of Mi mobile phones.&lt;br&gt;
First, get the URL that will be needed to make a request.&lt;/p&gt;

&lt;p&gt;The URL which we will be using: &lt;em&gt;&lt;a href="https://www.flipkart.com/mobiles/mi%7Ebrand/pr?sid=tyy%2C4io&amp;amp;otracker=nmenu_sub_Electronics_0_Mi&amp;amp;page="&gt;https://www.flipkart.com/mobiles/mi~brand/pr?sid=tyy%2C4io&amp;amp;otracker=nmenu_sub_Electronics_0_Mi&amp;amp;page=&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Now it is necessary to identify the right class of the elements we need to scrape. To inspect elements, follow these steps.&lt;br&gt;
Step 1 - Visit the URL.&lt;br&gt;
Step 2 - Right-click on the page and select &lt;em&gt;Inspect&lt;/em&gt;, or press &lt;em&gt;Ctrl&lt;/em&gt; + &lt;em&gt;Shift&lt;/em&gt; + &lt;em&gt;I&lt;/em&gt;.&lt;br&gt;
Step 3 - Hover over the name of the phone and click it. Select the class from the panel that appears on the right.&lt;br&gt;
Step 4 - Apply the same process for the price.&lt;br&gt;
Step 5 - Copy these classes somewhere; we will need them later in our code.&lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8dx3hc43023h3lshx75c.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8dx3hc43023h3lshx75c.PNG" alt="Blue Highlighted Text on the right determines class of phone name" width="800" height="421"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;Import Libraries, Modules, and Packages&lt;/h3&gt;

&lt;p&gt;Let's import the required libraries and packages first. To begin with, we will use Beautiful Soup to parse HTML documents so that we can extract data from them. Then we will import 'requests', which lets us interact with the web; it contains many useful features and methods for making HTTP requests.&lt;br&gt;
Lastly, we will need pandas to store our scraped data in an organized way.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;from bs4 import BeautifulSoup&lt;br&gt;
import requests&lt;br&gt;
import pandas as pd&lt;/code&gt;&lt;/p&gt;

&lt;h3&gt;Write the program&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Create empty lists for the phone names and prices.&lt;br&gt;
&lt;code&gt;phone_name = []&lt;br&gt;
phone_price = []&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Declare a variable &lt;em&gt;page_num&lt;/em&gt; that takes input from the user for the number of pages they want to scrape.&lt;br&gt;
&lt;code&gt;page_num = int(input("Enter number of pages:"))&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now we will use a &lt;em&gt;for&lt;/em&gt; loop to iterate over multiple pages, looking for the relevant information to extract.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Then let's store our &lt;em&gt;URL&lt;/em&gt; in the variable &lt;em&gt;url&lt;/em&gt; and use &lt;em&gt;requests&lt;/em&gt; to build a connection. Don't forget to append (+) &lt;em&gt;str(i)&lt;/em&gt;; this lets our program iterate over multiple pages inside the for loop.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;After making a request, let's use Beautiful Soup to parse the HTML and store the result in a variable called &lt;em&gt;content&lt;/em&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7xqd6pf3y6grllv77asa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7xqd6pf3y6grllv77asa.png" alt="Code" width="800" height="219"&gt;&lt;/a&gt;&lt;/p&gt;
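&lt;p&gt;The loop in the screenshot above can be sketched as follows. The base URL is the Flipkart listing URL given earlier (the one ending in &lt;em&gt;page=&lt;/em&gt;); treat this as a rough sketch, since the site's markup and URL parameters change over time.&lt;/p&gt;

```python
# Sketch of the scraping loop: request each listing page and parse it.
import requests
from bs4 import BeautifulSoup

def page_url(base_url, i):
    """Append the page number to the base URL via str(i)."""
    return base_url + str(i)

def fetch_pages(base_url, page_num):
    """Yield a parsed BeautifulSoup document for pages 1..page_num."""
    for i in range(1, page_num + 1):
        page = requests.get(page_url(base_url, i))     # make the request
        yield BeautifulSoup(page.text, "html.parser")  # parse the HTML
```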

&lt;ul&gt;
&lt;li&gt;Declare a variable called &lt;em&gt;name&lt;/em&gt; to find and store all &lt;em&gt;div&lt;/em&gt; tags with the relevant &lt;em&gt;class&lt;/em&gt; (the one we copied while inspecting the element) for the phone name.&lt;/li&gt;
&lt;li&gt;Declare a variable called &lt;em&gt;price&lt;/em&gt; to find and store all &lt;em&gt;div&lt;/em&gt; tags with the relevant &lt;em&gt;class&lt;/em&gt; (the one we copied while inspecting the element) for the phone price.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frik626wcrm6be8ty1325.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frik626wcrm6be8ty1325.png" alt="Code" width="800" height="299"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Let's append the extracted data to the empty lists we created at the start.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn6q7ldg0rjnsemopwncs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn6q7ldg0rjnsemopwncs.png" alt="Image description" width="800" height="518"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Now, let's create a DataFrame to store our data in a structured way and export the DataFrame to a &lt;em&gt;CSV&lt;/em&gt; file.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2j7v0i2u9ouebfitnge4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2j7v0i2u9ouebfitnge4.png" alt="Code" width="800" height="299"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is our aha moment, folks: we have successfully created our web scraper using Python and Beautiful Soup. To get a better understanding, pick a website you like and try scraping it.&lt;/p&gt;

</description>
      <category>python</category>
      <category>tutorial</category>
      <category>beginners</category>
      <category>webscraping</category>
    </item>
  </channel>
</rss>
