<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Shoeb Ahmed</title>
    <description>The latest articles on DEV Community by Shoeb Ahmed (@shoeb370).</description>
    <link>https://dev.to/shoeb370</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F531089%2Fb51cf72e-ded8-4988-ba3d-ac5b38778e45.JPG</url>
      <title>DEV Community: Shoeb Ahmed</title>
      <link>https://dev.to/shoeb370</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/shoeb370"/>
    <language>en</language>
    <item>
      <title>HireLink Contacts 10 December 2025 Embedded System (Electronics)</title>
      <dc:creator>Shoeb Ahmed</dc:creator>
      <pubDate>Thu, 11 Dec 2025 06:12:39 +0000</pubDate>
      <link>https://dev.to/shoeb370/hirelink-contacts-10-december-2025-embedded-systemelectronics-4fe7</link>
      <guid>https://dev.to/shoeb370/hirelink-contacts-10-december-2025-embedded-systemelectronics-4fe7</guid>
      <description>&lt;p&gt;Click this link to get it free of cost: &lt;a href="https://shoeb370.gumroad.com/l/gjwcr" rel="noopener noreferrer"&gt;https://shoeb370.gumroad.com/l/gjwcr&lt;/a&gt;&lt;/p&gt;

</description>
      <category>iot</category>
      <category>embeddedsystem</category>
      <category>electronics</category>
      <category>hiring</category>
    </item>
    <item>
      <title>Embedded System curated HR list 5 December 2025</title>
      <dc:creator>Shoeb Ahmed</dc:creator>
      <pubDate>Wed, 10 Dec 2025 05:55:17 +0000</pubDate>
      <link>https://dev.to/shoeb370/embedded-system-curated-hr-list-5-december-2025-9h3</link>
      <guid>https://dev.to/shoeb370/embedded-system-curated-hr-list-5-december-2025-9h3</guid>
      <description>&lt;p&gt;Visit this link to download: &lt;a href="https://shoeb370.gumroad.com/l/bgefy" rel="noopener noreferrer"&gt;https://shoeb370.gumroad.com/l/bgefy&lt;/a&gt;&lt;/p&gt;

</description>
      <category>hremaillist</category>
      <category>recruitercontacts</category>
      <category>jobsearch2025</category>
      <category>jobseekers</category>
    </item>
    <item>
      <title>Run Python code on AWS Batch.</title>
      <dc:creator>Shoeb Ahmed</dc:creator>
      <pubDate>Fri, 12 Aug 2022 09:23:48 +0000</pubDate>
      <link>https://dev.to/shoeb370/run-python-code-on-aws-batch-24en</link>
      <guid>https://dev.to/shoeb370/run-python-code-on-aws-batch-24en</guid>
      <description>&lt;h2&gt;
  
  
  Run Python code on AWS Batch.
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--r7oYXmYC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2Az8AdjOXd-OaWruIHYF779g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--r7oYXmYC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2Az8AdjOXd-OaWruIHYF779g.png" alt="AWS Batch Logo" width="347" height="153"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I have published three articles on how to run a basic Python script on AWS Batch. They take a step-by-step approach: create the Python script, run it in a Docker container, upload the containerized solution to an AWS repository, and finally run the containerized solution on AWS Batch.&lt;/p&gt;

&lt;p&gt;Here is the flow diagram of AWS Batch:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qQLKUdPY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2088/1%2AJ93wFX4zh92OYxBuxm7BCQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qQLKUdPY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2088/1%2AJ93wFX4zh92OYxBuxm7BCQ.png" alt="Block Diagram of the whole project." width="800" height="252"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://medium.com/codex/run-a-python-code-on-aws-batch-part-1-creation-of-python-script-and-docker-container-1b01dc89eaed"&gt;**Run a Python code on AWS Batch Part — 1: Creation of Python Script and Docker Container.&lt;/a&gt;**&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://medium.com/codex/run-a-python-code-on-aws-batch-part-2-uploading-data-to-ecr-and-creation-of-computing-c5dab12cd3eb"&gt;**Run a Python code on AWS Batch Part — 2: Uploading Data to ECR and Creation of Computing Environment.&lt;/a&gt;**&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://medium.com/codex/run-a-python-code-on-aws-batch-part-3-creation-of-job-queues-job-definitions-and-run-jobs-6ee0dd49e582"&gt;**Run a Python code on AWS Batch Part — 3: Creation of Job Queues, Job definitions and Run Jobs.&lt;/a&gt;**&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

</description>
    </item>
    <item>
      <title>Run a Python code on AWS Batch Part — 3: Creation of Job Queues, Job definitions and Run Jobs.</title>
      <dc:creator>Shoeb Ahmed</dc:creator>
      <pubDate>Fri, 12 Aug 2022 09:22:49 +0000</pubDate>
      <link>https://dev.to/shoeb370/run-a-python-code-on-aws-batch-part-3-creation-of-job-queues-job-definitions-and-run-jobs-43n8</link>
      <guid>https://dev.to/shoeb370/run-a-python-code-on-aws-batch-part-3-creation-of-job-queues-job-definitions-and-run-jobs-43n8</guid>
      <description>&lt;h2&gt;
  
  
  Run a Python code on AWS Batch Part — 3: Creation of Job Queues, Job definitions and Run Jobs.
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Kv0_k5-P--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A-QNpJmRsC7Tv2tKWoDXOkQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Kv0_k5-P--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A-QNpJmRsC7Tv2tKWoDXOkQ.jpeg" alt="Logo" width="462" height="198"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the previous articles, we saw how to create a container and upload it to AWS repositories. If you haven’t read those articles, please use the links below:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://medium.com/codex/run-a-python-code-on-aws-batch-part-1-creation-of-python-script-and-docker-container-1b01dc89eaed"&gt;&lt;strong&gt;Run a Python code on AWS Batch Part — 1: Creation of Python Script and Docker Container&lt;/strong&gt;:&lt;/a&gt; &lt;a href="https://medium.com/codex/run-a-python-code-on-aws-batch-part-1-creation-of-python-script-and-docker-container-1b01dc89eaed"&gt;https://medium.com/codex/run-a-python-code-on-aws-batch-part-1-creation-of-python-script-and-docker-container-1b01dc89eaed&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://medium.com/codex/run-a-python-code-on-aws-batch-part-2-uploading-data-to-ecr-and-creation-of-computing-c5dab12cd3eb"&gt;**Run a Python code on AWS Batch Part — 2: Uploading Data to ECR and Creation of Computing Environment&lt;/a&gt;.: &lt;a href="https://medium.com/codex/run-a-python-code-on-aws-batch-part-2-uploading-data-to-ecr-and-creation-of-computing-c5dab12cd3eb"&gt;**https://medium.com/codex/run-a-python-code-on-aws-batch-part-2-uploading-data-to-ecr-and-creation-of-computing-c5dab12cd3eb&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In this session, we are going to run the Python code in four steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Create a Job Queue&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a Job Definition&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a Job&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Run the Job&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Let’s start&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create a Job Queue&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;First, click on Job Queues in the AWS Batch dashboard and then click “Create”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ikgX3EAb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AZHTWqSjhzk1ykbP0gnqslQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ikgX3EAb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AZHTWqSjhzk1ykbP0gnqslQ.jpeg" alt="Job Queues — AWS Batch Dashboard" width="780" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I am going to name the job queue “test-queue-batch-v1”; you can choose any name you like. I am setting the priority to 1000, because job queues with a higher integer priority are given preference for compute environments.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lnXpl6ec--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A0ppl3u6HJv4SdrroF3Vg3g.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lnXpl6ec--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A0ppl3u6HJv4SdrroF3Vg3g.jpeg" alt="Creation of job queue — 1" width="470" height="554"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, I select the compute environment in which the container will run. In my case, I have named the compute environment “demo-batch-python-v1”. Click its radio button and make sure it turns blue.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_o9h-IBn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AmVZ9wnKT-IIO-_UQZWTf9A.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_o9h-IBn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AmVZ9wnKT-IIO-_UQZWTf9A.jpeg" alt="Creation of Job queue — 2" width="504" height="630"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click “Create” and we will see a “VALID” status message, and a green banner will appear at the top of the page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--AHW7ogsq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2ABFpSaXT0pKAkdFmQt1Uf1A.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--AHW7ogsq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2ABFpSaXT0pKAkdFmQt1Uf1A.jpeg" alt="Job queue dashboard" width="547" height="579"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;Create a Job Definition&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In the navigation bar on the left-hand side of the AWS Batch dashboard, click Job definitions and then click the “Create” button, just as we did for job queues.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lqExCLPV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AZpMQP2HZ6dgCYg5s6G08pw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lqExCLPV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AZpMQP2HZ6dgCYg5s6G08pw.jpeg" alt="Job definitions dashboard" width="777" height="583"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Clicking the “Create” button opens a form in which we fill in the job definition parameters. I am using the name “test-job-def-v1” and setting the execution timeout to about 1 hour; if our code runs beyond 1 hour, its execution is terminated.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--eX3nQyhl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2Ax8dRPD8HQclHngL-MIaRMw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--eX3nQyhl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2Ax8dRPD8HQclHngL-MIaRMw.jpeg" alt="Job Definitions form — 1" width="484" height="676"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we have to select EC2 for platform compatibility, or you can choose Fargate if you are familiar with it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2h0gu5dx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2Aou6IQOtsrOW0pI8xw2iZ4Q.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2h0gu5dx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2Aou6IQOtsrOW0pI8xw2iZ4Q.jpeg" alt="" width="494" height="301"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After that, we provide the location where the Docker container is stored. As the image below shows, I stored my container in Amazon ECR; copy the public ECR URI, which is marked in green in the image.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cPTKJfJC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2586/1%2ARKhS6I-TE5yMmPJTGZHWZQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cPTKJfJC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2586/1%2ARKhS6I-TE5yMmPJTGZHWZQ.jpeg" alt="" width="800" height="220"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Paste that URI into the image field.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ZBa6Lcjs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AUa_HCY6VErZbXmgNpin2Mw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ZBa6Lcjs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AUa_HCY6VErZbXmgNpin2Mw.jpeg" alt="" width="719" height="575"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After setting up all the job configuration, click the “Create” button.&lt;/p&gt;
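&lt;p&gt;The same job definition can be sketched in code. This is only an illustration of the fields we filled in above (name, image, 1-hour timeout); the image URI and the vCPU/memory sizes are placeholder assumptions, and the request would be passed to boto3’s register_job_definition.&lt;/p&gt;

```python
# Sketch: build the request for the AWS Batch RegisterJobDefinition API.
# "test-job-def-v1" and the 1-hour timeout match this walkthrough; the
# image URI and resource sizes below are placeholder assumptions.
def make_job_definition_request(name, image_uri, timeout_seconds=3600):
    return {
        "jobDefinitionName": name,
        "type": "container",
        "containerProperties": {
            "image": image_uri,  # the ECR URI copied above
            "vcpus": 1,
            "memory": 2048,
        },
        # Jobs running longer than this many seconds are terminated.
        "timeout": {"attemptDurationSeconds": timeout_seconds},
    }

request = make_job_definition_request(
    "test-job-def-v1", "public.ecr.aws/example/demo-batch-python:latest"
)
```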

&lt;p&gt;Our next task is to run that image, i.e. the Docker container, on AWS Batch by creating a job.&lt;/p&gt;

&lt;p&gt;Now click on &lt;strong&gt;Jobs&lt;/strong&gt; on the left-hand side of the navigation bar, and then click &lt;strong&gt;Submit new job&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Ss_rIcCG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3124/1%2AioPdvnO5wTddWmsZse5k9A.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Ss_rIcCG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3124/1%2AioPdvnO5wTddWmsZse5k9A.jpeg" alt="" width="800" height="265"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After clicking on Submit new job, we are redirected to a form page where we fill in some details step by step.&lt;br&gt;
In the &lt;strong&gt;Name&lt;/strong&gt; field, we can enter a job name of our own; we have already created the job definition and job queue.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--df3ZTWOH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AQNhfCxjbEtywDzTElVNj1g.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--df3ZTWOH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AQNhfCxjbEtywDzTElVNj1g.jpeg" alt="" width="756" height="693"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If we select the &lt;strong&gt;job definition&lt;/strong&gt;, all the fields are filled in automatically by &lt;strong&gt;AWS&lt;/strong&gt; itself.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;job configuration&lt;/strong&gt; is likewise filled in automatically; we just need to verify it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_AURf88e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2As1PU5mnVb8-oQNjxndln3g.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_AURf88e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2As1PU5mnVb8-oQNjxndln3g.jpeg" alt="" width="721" height="654"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After that, click Submit, then go to the &lt;strong&gt;Dashboard&lt;/strong&gt;, where we can see that our task has been submitted.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wD2Du_Ap--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2542/1%2A38j2GzjbCo_RMX3WW2DRmg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wD2Du_Ap--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2542/1%2A38j2GzjbCo_RMX3WW2DRmg.jpeg" alt="" width="800" height="322"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The columns in the above image show the status of our AWS Batch jobs, i.e. the different stages a Batch job moves through.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Stages ranges from Submitted -&amp;gt; Runnable -&amp;gt; Starting -&amp;gt;Running -&amp;gt; Succeeded/Failed.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;All the stages complete automatically; we can observe the outcome in the Succeeded/Failed columns.&lt;/p&gt;
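&lt;p&gt;If we were watching the job from a script rather than the dashboard, the same stages could be checked like this. A small sketch: the status strings are the uppercase values the Batch API reports (the API also reports a PENDING stage between SUBMITTED and RUNNABLE), and a real loop would read them via boto3’s describe_jobs.&lt;/p&gt;

```python
# The Batch API reports job status as uppercase strings; PENDING is an
# additional stage the API can report between SUBMITTED and RUNNABLE.
STAGES = ["SUBMITTED", "PENDING", "RUNNABLE", "STARTING", "RUNNING"]
TERMINAL = {"SUCCEEDED", "FAILED"}

def is_terminal(status):
    # True once the job has finished, successfully or not.
    return status in TERMINAL

# A polling loop would call boto3.client("batch").describe_jobs(jobs=[job_id]),
# read response["jobs"][0]["status"], and stop once is_terminal(...) is True.
```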

&lt;p&gt;In my scenario, the job executed properly, and I can see the result by clicking on the number &lt;strong&gt;“2”&lt;/strong&gt;. I had run the same job previously, which is why it shows 2 rather than 1; if your job fails, it will show 1 in the Failed column.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--c1D3DzL---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2512/1%2AtxZmU4QcnqewJETCPOB50g.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--c1D3DzL---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2512/1%2AtxZmU4QcnqewJETCPOB50g.jpeg" alt="" width="800" height="150"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In my scenario, I click on the 2 in the above image and am redirected to the list of succeeded jobs.&lt;/p&gt;

&lt;p&gt;I am going to click on &lt;strong&gt;“my-job-v1”&lt;/strong&gt;, because that is the job we just ran.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ZjFnwZK4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2618/1%2Ac7tvYyK3jYucF0uLV_vjcg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ZjFnwZK4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2618/1%2Ac7tvYyK3jYucF0uLV_vjcg.jpeg" alt="" width="800" height="339"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It takes us to the &lt;strong&gt;Job Information&lt;/strong&gt; page, where we see the &lt;strong&gt;Log stream name&lt;/strong&gt;. Below that, a link is provided; click it to see the output. The link takes us to &lt;strong&gt;AWS CloudWatch&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1Ee4BuIw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2610/1%2AnnnHd2jJERRg2V0QIKky_A.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1Ee4BuIw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2610/1%2AnnnHd2jJERRg2V0QIKky_A.jpeg" alt="" width="800" height="334"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;What is AWS CloudWatch? In short, it lets us view the output and logs of services such as AWS Batch, AWS Lambda, etc.&lt;/p&gt;

&lt;p&gt;You can get more detail by clicking on &lt;a href="https://aws.amazon.com/cloudwatch/"&gt;&lt;strong&gt;AWS CloudWatch&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--oluP35B5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2626/1%2AYUaKRq9zX78Q-4mKAXnISA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--oluP35B5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2626/1%2AYUaKRq9zX78Q-4mKAXnISA.jpeg" alt="The output of AWS Batch" width="800" height="270"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The image above shows the output of the jobs.&lt;/p&gt;

&lt;p&gt;We have successfully run a job on AWS Batch.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Run a Python code on AWS Batch Part — 2: Uploading Data to ECR and Creation of Computing Environment.</title>
      <dc:creator>Shoeb Ahmed</dc:creator>
      <pubDate>Tue, 19 Jul 2022 07:25:41 +0000</pubDate>
      <link>https://dev.to/shoeb370/run-a-python-code-on-aws-batch-part-2-uploading-data-to-ecr-and-creation-of-computing-environment-5gne</link>
      <guid>https://dev.to/shoeb370/run-a-python-code-on-aws-batch-part-2-uploading-data-to-ecr-and-creation-of-computing-environment-5gne</guid>
      <description>&lt;h2&gt;
  
  
  Run a Python code on AWS Batch Part — 2: Uploading Data to ECR and Creation of Computing Environment.
&lt;/h2&gt;

&lt;p&gt;In the previous article, I showed how to create and run a simple Python script, and how to create a simple Dockerfile and Docker container that run on a local system.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--XHhfyw52--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2158/1%2AtQdtKoTCpuPQ41V3N7AFew.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XHhfyw52--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2158/1%2AtQdtKoTCpuPQ41V3N7AFew.jpeg" alt="AWS Batch + Python" width="800" height="147"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this article, we move to the AWS Batch side. To run an AWS Batch job, we require an image file (Docker container) and a compute environment. We will:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Create a repository in “&lt;strong&gt;Amazon Elastic Container Registry&lt;/strong&gt;”&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Push the Docker Container into the repository.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Creating a “Compute Environment” in the AWS batch.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;To run a Docker container on AWS Batch, we need to store it on the AWS platform. We are going to use the &lt;strong&gt;Amazon Elastic Container Registry&lt;/strong&gt; (ECR), where we can store our Docker container. Search for &lt;strong&gt;Amazon Elastic Container Registry&lt;/strong&gt; to open the ECR dashboard.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--U7LvFQTG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2134/1%2AkGscCbkHwW9b1PQ3xgtIDw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U7LvFQTG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2134/1%2AkGscCbkHwW9b1PQ3xgtIDw.jpeg" alt="" width="800" height="483"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click &lt;strong&gt;“Get Started”&lt;/strong&gt;. This takes us to the create-repository page.&lt;/p&gt;

&lt;p&gt;I am going to use a public repository because we are in the learning phase.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--nhxDbwO1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A4W-dPlUIQ2ObwenCfklaSA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--nhxDbwO1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A4W-dPlUIQ2ObwenCfklaSA.jpeg" alt="" width="739" height="660"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Give the repository a name, which is a compulsory field.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6Ilz8uie--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AW-hnlnKHz5CoApJByKjAhA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6Ilz8uie--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AW-hnlnKHz5CoApJByKjAhA.jpeg" alt="" width="756" height="213"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I am selecting all &lt;strong&gt;“Content types”&lt;/strong&gt;; you can choose your own content types.&lt;/p&gt;

&lt;p&gt;Then I click &lt;strong&gt;“Create repository”&lt;/strong&gt;. The page redirects to the repository list, where we can see our new repository.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cPTKJfJC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2586/1%2ARKhS6I-TE5yMmPJTGZHWZQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cPTKJfJC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2586/1%2ARKhS6I-TE5yMmPJTGZHWZQ.jpeg" alt="" width="800" height="220"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we click on our repository name and after that, we will click on &lt;strong&gt;“View push commands”&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--j9ZPi_LQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2566/1%2A2Q_v0AgPIpK4yPGjNyDppQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--j9ZPi_LQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2566/1%2A2Q_v0AgPIpK4yPGjNyDppQ.jpeg" alt="" width="800" height="217"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After clicking that button, we see a list of push commands; we need to follow all of them.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--G9bSiLPP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AYtQvWFJwJm7mVULtwhU2xw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--G9bSiLPP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AYtQvWFJwJm7mVULtwhU2xw.jpeg" alt="" width="738" height="591"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So, I am going to execute the commands line by line in the command prompt.&lt;/p&gt;

&lt;p&gt;Make sure your current working directory is the one where you saved your Python script and Dockerfile.&lt;/p&gt;

&lt;p&gt;First, copy the first command.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8EWrBsKZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2784/1%2ArA0kZ7HMQMcqmKiGcbVsgg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8EWrBsKZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2784/1%2ArA0kZ7HMQMcqmKiGcbVsgg.jpeg" alt="" width="800" height="32"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We should see “Login Succeeded”. If you don’t, try configuring the AWS CLI again.&lt;/p&gt;

&lt;p&gt;Now I am going for the second command.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WQ80i6SR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2120/1%2A1g0Sabnvm6DouEqSv-GoTg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WQ80i6SR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2120/1%2A1g0Sabnvm6DouEqSv-GoTg.jpeg" alt="" width="800" height="581"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Third command.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--L4Eo2XKL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2222/1%2A0u2Al20a2cAWcgt_dVm57A.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--L4Eo2XKL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2222/1%2A0u2Al20a2cAWcgt_dVm57A.png" alt="" width="800" height="35"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The last command pushes your image to the repository.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fBfXiMbN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2566/1%2AKTDST6ZxsFK3acU1no57SQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fBfXiMbN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2566/1%2AKTDST6ZxsFK3acU1no57SQ.jpeg" alt="" width="800" height="161"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the above screenshot, we can see an image tagged ‘latest’.&lt;/p&gt;
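&lt;p&gt;For reference, the four push commands follow this general shape. This is a sketch only: the registry alias and repository name below are placeholders, so use the exact commands shown by “View push commands” for your own repository.&lt;/p&gt;

```shell
# Sketch of the ECR "view push commands" sequence for a public repository.
# Replace your-alias and demo-batch-python with your own registry alias and
# repository name from the ECR console.

# 1. Authenticate Docker against the public ECR registry.
aws ecr-public get-login-password --region us-east-1 | \
    docker login --username AWS --password-stdin public.ecr.aws

# 2. Build the image from the Dockerfile in the current directory.
docker build -t demo-batch-python .

# 3. Tag the image with the repository URI.
docker tag demo-batch-python:latest public.ecr.aws/your-alias/demo-batch-python:latest

# 4. Push the tagged image to the repository.
docker push public.ecr.aws/your-alias/demo-batch-python:latest
```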

&lt;p&gt;Now we move to AWS Batch; first, we will create a “Compute Environment”.&lt;/p&gt;

&lt;p&gt;Click on compute environment on the left-hand side of the AWS Batch dashboard.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--FAvKBxRT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A6QfffS3OUk_7BunSIrBJFw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FAvKBxRT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A6QfffS3OUk_7BunSIrBJFw.jpeg" alt="" width="631" height="370"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click on “Create” on the right-hand side of Compute Environments.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lDdtByzX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2618/1%2Aeh3efZ-kl3GteWbPijZA0Q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lDdtByzX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2618/1%2Aeh3efZ-kl3GteWbPijZA0Q.png" alt="" width="800" height="135"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Fill in the details given below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bQc92LTD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AVJscVAHrw_z-2LT3EuVfFg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bQc92LTD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AVJscVAHrw_z-2LT3EuVfFg.png" alt="" width="758" height="701"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the Instance configuration, I am using Spot instances.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bqSG2Whz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AnQK0yKYQKZ_xQ24Gug9VDw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bqSG2Whz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AnQK0yKYQKZ_xQ24Gug9VDw.png" alt="Instance Configuration — 1" width="776" height="679"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8pPkKc-2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2An3sEkjxIR8jlIr1qCVbz-w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8pPkKc-2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2An3sEkjxIR8jlIr1qCVbz-w.png" alt="Instance Configuration — 2" width="707" height="503"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For &lt;strong&gt;Networking&lt;/strong&gt;, I am using the same VPC ID that I used in AWS Redshift; the link is here: &lt;a href="https://medium.com/codex/aws-redshift-connects-with-python-part-1-setup-a-redshift-connection-with-python-b9f6a1fa49f0"&gt;https://medium.com/codex/aws-redshift-connects-with-python-part-1-setup-a-redshift-connection-with-python-b9f6a1fa49f0&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5y4XPa_j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AYEtJxt3RthKWQYdXsNf15w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5y4XPa_j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AYEtJxt3RthKWQYdXsNf15w.png" alt="" width="726" height="541"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After that, click on “Create compute environment”. We will then see the status of the compute environment, which is “Valid”, as shown in the image below.&lt;/p&gt;
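The same console steps can also be scripted. Below is a minimal boto3 sketch; the environment name, subnet, security group, instance role, and service-role ARN are all placeholders you would replace with your own values, not values from this article.

```python
# Sketch: create an AWS Batch compute environment with boto3.
# Every identifier below is a placeholder -- substitute your own.
compute_environment = {
    "computeEnvironmentName": "sample-compute-env",
    "type": "MANAGED",
    "state": "ENABLED",
    "computeResources": {
        "type": "SPOT",              # the article uses Spot instances
        "minvCpus": 0,
        "maxvCpus": 4,
        "instanceTypes": ["optimal"],
        "subnets": ["subnet-xxxxxxxx"],
        "securityGroupIds": ["sg-xxxxxxxx"],
        "instanceRole": "ecsInstanceRole",
    },
    "serviceRole": "arn:aws:iam::123456789012:role/AWSBatchServiceRole",
}

def create_compute_environment(batch_client, config=compute_environment):
    """Submit the request; pass a boto3 'batch' client, e.g. boto3.client('batch')."""
    return batch_client.create_compute_environment(**config)
```

With real values filled in, calling create_compute_environment(boto3.client("batch")) does the same thing as the console form above.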

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tm3PCASa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2596/1%2A9bHAB9cZW1oaTiL_8abckg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tm3PCASa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2596/1%2A9bHAB9cZW1oaTiL_8abckg.png" alt="" width="800" height="174"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will cover the creation of the Job Queue and Job Definition in the next article, part 3.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Run a Python code on AWS Batch Part — 1: Creation of Python Script and Docker Container.</title>
      <dc:creator>Shoeb Ahmed</dc:creator>
      <pubDate>Tue, 19 Jul 2022 07:24:53 +0000</pubDate>
      <link>https://dev.to/shoeb370/run-a-python-code-on-aws-batch-part-1-creation-of-python-script-and-docker-container-1m08</link>
      <guid>https://dev.to/shoeb370/run-a-python-code-on-aws-batch-part-1-creation-of-python-script-and-docker-container-1m08</guid>
      <description>&lt;h2&gt;
  
  
  Run a Python code on AWS Batch Part — 1: Creation of Python Script and Docker Container.
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--QgLcvqAP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2Aww8k8sMq4_FexQaFg34H5Q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--QgLcvqAP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2Aww8k8sMq4_FexQaFg34H5Q.png" alt="" width="548" height="248"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A prerequisite for running Python code on AWS Batch is that you already have Python and Docker installed on your system. Whether or not you know Docker, our goal is to run the Python code on AWS Batch.&lt;/p&gt;

&lt;p&gt;We will start by running a basic Python script that prints messages.&lt;/p&gt;

&lt;p&gt;Steps are given below:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Create a Python script.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a Dockerfile.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Build the Docker image.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Run the Docker container.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;First, start writing the Python script.&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
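The embedded gist does not render in this feed, so here is a minimal stand-in for a message-printing myfunction.py. The exact messages are my assumption, not the author's gist.

```python
# myfunction.py -- minimal stand-in for the article's script (assumed content).
import datetime

def main():
    messages = [
        "Hello from AWS Batch!",
        "Job started at " + datetime.datetime.now().isoformat(),
        "Job finished successfully.",
    ]
    for msg in messages:
        print(msg)
    return messages

if __name__ == "__main__":
    main()
```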
&lt;br&gt;
The output of the above code is shown below.

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tYMHiF_L--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AcI6Og_8f_ja2TProZ8m4JQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tYMHiF_L--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AcI6Og_8f_ja2TProZ8m4JQ.png" alt="" width="461" height="165"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The next step is to create a Dockerfile. We need to put certain commands in it, which are given below. Make sure the file is named &lt;strong&gt;“Dockerfile”&lt;/strong&gt;, and that the Python script is named &lt;strong&gt;“myfunction.py”&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;The command for Docker is given below:&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
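The gist embed is empty here as well. A typical Dockerfile for this setup might look roughly like the sketch below; the base image, and every line other than the file name myfunction.py and the pandas install the article mentions, are my assumptions.

```dockerfile
# Sketch of a Dockerfile for running myfunction.py (details assumed).
FROM python:3.9-slim

WORKDIR /app

COPY myfunction.py .

# Install any external packages your script needs (the article uses pandas).
RUN pip install pandas

CMD ["python", "myfunction.py"]
```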
&lt;br&gt;
If you use external packages (I use pandas), you can add the command shown on line number 5 of the above code:

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;RUN pip install pandas numpy more-packages-separated-by-whitespace&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;So far we have created a Dockerfile and a Python script; our next goal is to build the Docker image and run a container from it.&lt;/p&gt;

&lt;p&gt;Before that, we need to set up the AWS CLI, because we will later push the Docker image to AWS.&lt;/p&gt;

&lt;p&gt;Make sure you have already installed the AWS CLI; the Windows installer is available here: &lt;a href="https://awscli.amazonaws.com/AWSCLIV2.msi"&gt;https://awscli.amazonaws.com/AWSCLIV2.msi&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we will configure the AWS CLI by running a few commands.&lt;/p&gt;

&lt;p&gt;Make sure you have an AWS ACCESS KEY ID and AWS SECRET ACCESS KEY; you can get them from the Security Credentials page.&lt;/p&gt;

&lt;p&gt;If you want to generate that key please visit my old article: &lt;a href="https://medium.com/codex/aws-s3-with-python-by-shoeb-ahmed-923c1cf47110"&gt;https://medium.com/codex/aws-s3-with-python-by-shoeb-ahmed-923c1cf47110&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After getting the keys, let’s jump into the AWS CLI and open the command prompt.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Open CMD&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Type “aws”. If it shows a message like the one below, the AWS CLI is installed on your system:&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--czgvC9eh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2ALNr9hIfMXE3j0Mj7JmxbGA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--czgvC9eh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2ALNr9hIfMXE3j0Mj7JmxbGA.png" alt="" width="627" height="216"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Now type “aws configure” on the command prompt:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--a2DrtxQ9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A2ylbn1wpvaWyiYHnfPNEnw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--a2DrtxQ9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A2ylbn1wpvaWyiYHnfPNEnw.png" alt="" width="268" height="42"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Enter your access key and press Enter button.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enter your secret key and press Enter button.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;After that, it will ask for the default region and output format; leave them blank and press Enter.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once the AWS CLI is configured, we can move on to building the container image.&lt;/p&gt;

&lt;p&gt;Make sure the Dockerfile and the Python file are in the same location.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8qO0r58N--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AWrLdJ10cS7xV2DApyguOFQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8qO0r58N--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AWrLdJ10cS7xV2DApyguOFQ.png" alt="" width="619" height="94"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Open cmd in that location and type:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;docker build -t sample-aws-code .&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Note the whitespace and the trailing dot after “&lt;strong&gt;sample-aws-code&lt;/strong&gt;”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--m2CxyEkk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AXFxciCKAgMwGkUnwC5zffQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--m2CxyEkk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AXFxciCKAgMwGkUnwC5zffQ.png" alt="" width="800" height="357"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the above screenshot, we can see that our image has been built.&lt;/p&gt;

&lt;p&gt;We can test the image locally by running:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;docker run sample-aws-code&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qll-dpVx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AP96VZ0fOjMgakUsA93qBAQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qll-dpVx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AP96VZ0fOjMgakUsA93qBAQ.png" alt="" width="534" height="243"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is the output of the sample code.&lt;/p&gt;

&lt;p&gt;In part 2, I will cover setting up and running AWS Batch.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>AWS Redshift connects with Python Part 2 — Python code to connect AWS Redshift</title>
      <dc:creator>Shoeb Ahmed</dc:creator>
      <pubDate>Tue, 05 Jul 2022 12:55:22 +0000</pubDate>
      <link>https://dev.to/shoeb370/aws-redshift-connects-with-python-part-2-python-code-to-connect-aws-redshift-1gbd</link>
      <guid>https://dev.to/shoeb370/aws-redshift-connects-with-python-part-2-python-code-to-connect-aws-redshift-1gbd</guid>
      <description>&lt;h2&gt;
  
  
  AWS Redshift connects with Python Part 2 — Python code to connect AWS Redshift
&lt;/h2&gt;

&lt;p&gt;In Part 1, we set up the AWS Redshift account to connect to the outside world.&lt;/p&gt;

&lt;p&gt;In this session, I will demonstrate the Python code needed to connect to AWS Redshift.&lt;/p&gt;

&lt;p&gt;My main goal is to upload a pandas DataFrame to AWS Redshift.&lt;/p&gt;

&lt;p&gt;Before going to the coding part, let's take a quick look at Query Editor V2.&lt;/p&gt;

&lt;p&gt;First, log in to AWS and go to the Redshift dashboard.&lt;/p&gt;

&lt;p&gt;On the dashboard page we will see the list of clusters in the Cluster overview section.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TEJq_y2z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2622/1%2AXCMN7ccEgUPpQwZ8mKaMQA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TEJq_y2z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2622/1%2AXCMN7ccEgUPpQwZ8mKaMQA.jpeg" alt="" width="800" height="262"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can see that I have created two clusters: one is active (“Available”) and the other is inactive (“Paused”). I am going to use redshift-cluster-3.&lt;/p&gt;

&lt;p&gt;I will also show you how to make redshift-cluster-2 “Available” again.&lt;/p&gt;

&lt;p&gt;Click the “redshift-cluster-2” name (or your cluster's name); it will redirect to the cluster page. Click on “Actions” to open a drop-down menu.&lt;/p&gt;

&lt;p&gt;Inside the menu you will see “Resume”; please click on it. It takes a few minutes to activate the cluster. On the AWS Redshift cluster dashboard the status will show “Modifying”, and after some time it will become “Available”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UPhSlkU0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2374/1%2A4eXTPWnRM1-TkEK3adz4MA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UPhSlkU0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2374/1%2A4eXTPWnRM1-TkEK3adz4MA.jpeg" alt="" width="800" height="282"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once our cluster is “Available”, let's start working on the Python code.&lt;/p&gt;

&lt;p&gt;We are going to install a redshift connector by running the command given below.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;pip install redshift-connector&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;To connect to AWS Redshift from Python we need certain credentials: hostname, database name, port number, user, and password.&lt;/p&gt;

&lt;p&gt;On the cluster page, under the Properties tab, we will find the credentials we are looking for.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MsYD_G54--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2676/1%2ACt0_du7js8O3DiZRLnvPbg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MsYD_G54--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2676/1%2ACt0_du7js8O3DiZRLnvPbg.jpeg" alt="" width="800" height="372"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we are going to run the query in python to create a table on Redshift.&lt;/p&gt;

&lt;p&gt;Before that, let's create a sample DataFrame.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
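The gist embed is missing from the feed; a stand-in for the sample DataFrame might look like the sketch below. The column names and values are my assumptions, not the article's actual data.

```python
# Sketch: a sample pandas DataFrame (columns and values are assumed).
import pandas as pd

df = pd.DataFrame(
    {
        "id": [1, 2, 3],
        "name": ["alpha", "beta", "gamma"],
        "score": [9.5, 7.2, 8.8],
    }
)
print(df)
```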
&lt;br&gt;
Now we are going to create a table inside the AWS Redshift cluster.&lt;br&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
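Since the gist does not render, here is a hedged sketch of the table-creation step using redshift-connector. The host, database, user, password, table name, and column types are all placeholders, not the article's actual values.

```python
# Sketch: create a table on Redshift with redshift-connector.
# Table name and columns are illustrative placeholders.
create_table_sql = """
CREATE TABLE IF NOT EXISTS sample_table (
    id INTEGER,
    name VARCHAR(50),
    score FLOAT
);
"""

def create_table():
    import redshift_connector  # pip install redshift-connector
    # All connection credentials below are placeholders.
    conn = redshift_connector.connect(
        host="your-cluster-endpoint.redshift.amazonaws.com",
        database="dev",
        port=5439,
        user="awsuser",
        password="your-password",
    )
    cursor = conn.cursor()
    cursor.execute(create_table_sql)
    conn.commit()
    conn.close()
```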
&lt;br&gt;
We will see that the table has been created.

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xGTq7V0X--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AKOmoATIo4hi2-NpUjtlg9Q.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xGTq7V0X--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AKOmoATIo4hi2-NpUjtlg9Q.jpeg" alt="" width="375" height="488"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we are going to insert the DataFrame into the table.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
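The gist for the insert step is also missing; one common way to do it with redshift-connector is an executemany over the DataFrame's rows, as in this sketch. The table name, columns, and data are assumptions for illustration.

```python
# Sketch: insert a pandas DataFrame into a Redshift table row by row.
# Table name, columns, and data are illustrative assumptions.
import pandas as pd

df = pd.DataFrame(
    {"id": [1, 2, 3], "name": ["alpha", "beta", "gamma"], "score": [9.5, 7.2, 8.8]}
)

# Convert the DataFrame into a list of plain tuples for executemany().
rows = list(df.itertuples(index=False, name=None))

insert_sql = "INSERT INTO sample_table (id, name, score) VALUES (%s, %s, %s)"

def insert_rows(conn):
    """Pass an open redshift_connector connection."""
    cursor = conn.cursor()
    cursor.executemany(insert_sql, rows)
    conn.commit()
```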
&lt;br&gt;
In the query editor, we can now see that our table has been created and filled with data.

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4B6LOQFU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A2bQjBXdJJCsygP-_85NMNQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4B6LOQFU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A2bQjBXdJJCsygP-_85NMNQ.png" alt="" width="407" height="117"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This whole sample demonstrates how to create an AWS Redshift cluster and insert pandas DataFrames into it.&lt;/p&gt;

&lt;p&gt;We can upload any amount of DataFrame data into the Redshift cluster this way.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Web Scraping the Phillips India website — Scraping the Headphone Category.</title>
      <dc:creator>Shoeb Ahmed</dc:creator>
      <pubDate>Mon, 04 Jul 2022 11:15:07 +0000</pubDate>
      <link>https://dev.to/shoeb370/web-scraping-the-phillips-india-website-scraping-the-headphone-category-59j9</link>
      <guid>https://dev.to/shoeb370/web-scraping-the-phillips-india-website-scraping-the-headphone-category-59j9</guid>
      <description>&lt;h2&gt;
  
  
  Web Scraping the Phillips India website — Scraping the Headphone Category.
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rKXDh_N6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/8304/0%2A7ZI0nhI0bICoK6li" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rKXDh_N6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/8304/0%2A7ZI0nhI0bICoK6li" alt="Photo by [Iwona Castiello d'Antonio](https://unsplash.com/@aquadrata?utm_source=medium&amp;amp;utm_medium=referral) on [Unsplash](https://unsplash.com?utm_source=medium&amp;amp;utm_medium=referral)" width="800" height="494"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here I will scrape content from the Philips India website: specifically, the Headphones category.&lt;/p&gt;

&lt;p&gt;Web scraping is usually part of a Data Science/Data Analyst job, under data generation or data aggregation.&lt;/p&gt;

&lt;p&gt;We will take a step-by-step approach to scraping the website, using Python, Selenium, and Beautiful Soup.&lt;/p&gt;

&lt;p&gt;First, we are going to install some packages by typing the following commands in cmd or the Anaconda prompt.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;pip install lxml&lt;br&gt;
 pip install selenium&lt;br&gt;
 pip install beautifulsoup4&lt;br&gt;
 pip install pandas&lt;br&gt;
 pip install webdriver-manager&lt;br&gt;
 pip install regex&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gq9XcBvT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AdHicG7rF6tV5vVtjFgX0eQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gq9XcBvT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AdHicG7rF6tV5vVtjFgX0eQ.png" alt="" width="800" height="338"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6E-qcXlv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2146/1%2AON-EFSuptsQpeZ4gls3moQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6E-qcXlv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2146/1%2AON-EFSuptsQpeZ4gls3moQ.png" alt="" width="800" height="406"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;First, we import the packages we need.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
&lt;br&gt;
I will go directly to the page that lists all the headphones and all the headphone types, so we don't have to automate much navigation on the website.

&lt;p&gt;The page looks like the screenshot below; you can visit it yourself here: &lt;a href="https://www.philips.co.in/c-m-so/headphones/latest#"&gt;https://www.philips.co.in/c-m-so/headphones/latest#&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--VbF0QQJ7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A59XMWKyzsBZzVxIJeQXvEw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--VbF0QQJ7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A59XMWKyzsBZzVxIJeQXvEw.png" alt="" width="684" height="629"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;First, we will scrape the data present inside each small container (box). I am going to extract the model number, category name, and overall rating.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
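The gist embed is missing here. As a simplified, self-contained sketch of what this step produces: in the article the fields come from Selenium/Beautiful Soup lookups on the live page, while below they are hard-coded stand-ins (the model numbers, ratings, and URLs are invented) so that the bookkeeping structure is clear.

```python
# Simplified sketch of the scraping loop's bookkeeping. The tuples below are
# stand-ins for what the Selenium/BeautifulSoup lookups would yield per box;
# all values, including the product URLs, are invented for illustration.
products = []
product_links = []

scraped_boxes = [
    ("TAH1205", "Wireless headphone", "4.5", "https://www.philips.co.in/p/TAH1205"),
    ("SHE1350", "In ear headphone", "4.1", "https://www.philips.co.in/p/SHE1350"),
]

for model, category, rating, link in scraped_boxes:
    products.append({"Model": model, "Category": category, "Rating": rating})
    product_links.append(link)  # visited later for detailed specifications
```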
&lt;br&gt;
Line number 19 of the above code collects the links of the products. By visiting each product's page through those links, we can collect detailed data such as technical specifications.

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jczs4qMq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2004/1%2AluRKV0-aObzGwxoGpqDEnA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jczs4qMq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2004/1%2AluRKV0-aObzGwxoGpqDEnA.png" alt="" width="800" height="497"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cpxq3jRt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2Ask_1jit5T5OK47BEx4HbxQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cpxq3jRt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2Ask_1jit5T5OK47BEx4HbxQ.png" alt="" width="742" height="621"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
&lt;br&gt;
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MtSCEzpQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2Aq4mQoMJoMG3Ya_fD4cNclQ.png" alt="" width="739" height="475"&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--pUiKr_Es--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A8pyiLmRim82wUi4vE5WesA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pUiKr_Es--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A8pyiLmRim82wUi4vE5WesA.png" alt="" width="789" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, having scraped all the data, we save it in CSV format.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
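The gist for the save step is missing as well. The article most likely uses pandas' to_csv; this stdlib sketch shows the same idea with the article's output file name, and the rows below are invented stand-ins for the scraped data.

```python
# Sketch: save scraped rows to CSV (the rows here are invented stand-ins).
import csv

rows = [
    {"Model": "TAH1205", "Category": "Wireless headphone", "Rating": "4.5"},
    {"Model": "SHE1350", "Category": "In ear headphone", "Rating": "4.1"},
]

with open("Final_Headphones.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["Model", "Category", "Rating"])
    writer.writeheader()
    writer.writerows(rows)
```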
&lt;br&gt;
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---L1vN7st--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AMc7BVNfJLf12y7fxoEMqCQ.png" alt="" width="479" height="258"&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---lYKXOyS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2638/1%2AsHfZzSuj0LlwLFFUFkR9BQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---lYKXOyS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2638/1%2AsHfZzSuj0LlwLFFUFkR9BQ.png" alt="" width="800" height="311"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The data looks like the image above.&lt;/p&gt;

&lt;p&gt;The file will be saved as “Final_Headphones.csv”.&lt;/p&gt;

&lt;p&gt;The entire code is shown below:&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
&lt;br&gt;
That’s all about scraping the data.

&lt;p&gt;Thank you.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>AWS Redshift connects with Python Part 1 — Setup a Redshift connection with Python.</title>
      <dc:creator>Shoeb Ahmed</dc:creator>
      <pubDate>Mon, 04 Jul 2022 11:13:41 +0000</pubDate>
      <link>https://dev.to/shoeb370/aws-redshift-connects-with-python-part-1-setup-a-redshift-connection-with-python-2k98</link>
      <guid>https://dev.to/shoeb370/aws-redshift-connects-with-python-part-1-setup-a-redshift-connection-with-python-2k98</guid>
      <description>&lt;h2&gt;
  
  
  AWS Redshift connects with Python Part 1 — Setup a Redshift connection with Python.
&lt;/h2&gt;

&lt;p&gt;Welcome to the &lt;em&gt;Amazon Redshift Cluster Management Guide&lt;/em&gt;. Amazon Redshift is a fully managed, petabyte-scale data warehouse service in the cloud. You can start with just a few hundred gigabytes of data and scale to a petabyte or more.&lt;/p&gt;

&lt;p&gt;You can get more details on the websites: &lt;a href="https://docs.aws.amazon.com/redshift/latest/mgmt/welcome.html"&gt;https://docs.aws.amazon.com/redshift/latest/mgmt/welcome.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We are going to set things up on the AWS side, step by step.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PRVB3S73--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AkAV5E6ENslE-8BvKxeQI9Q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PRVB3S73--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AkAV5E6ENslE-8BvKxeQI9Q.png" alt="" width="456" height="104"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Assuming you have created an AWS account, log in and go to AWS Redshift. The page looks like the one shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--JoUEzCnB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2188/1%2AmhnjKIBPcqtIlG6UDnG4Ug.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--JoUEzCnB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2188/1%2AmhnjKIBPcqtIlG6UDnG4Ug.jpeg" alt="" width="800" height="486"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the AWS Redshift page above there is an option called “Create cluster”. Click on it to proceed further.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5ojdW4at--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A5zbm3mtJlHjixU3BgECKYQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5ojdW4at--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A5zbm3mtJlHjixU3BgECKYQ.jpeg" alt="" width="772" height="717"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After clicking on “Create cluster” we need to give a Cluster identifier; I have given “redshift-cluster-v1”. Then click on the Free trial option, shown in the red box on the right-hand side above.&lt;/p&gt;

&lt;p&gt;Scrolling down, we see fields for the database name, port number, and password. I chose to keep the defaults as they are, but we do have to set our own password.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--I4o1XTTV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AyPaZ4oVBySm-RYZMfHoV6Q.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--I4o1XTTV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AyPaZ4oVBySm-RYZMfHoV6Q.jpeg" alt="" width="755" height="669"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After entering the Admin user name and password, click on Create cluster.&lt;/p&gt;

&lt;p&gt;After clicking “Create cluster” the page will automatically navigate to the cluster homepage, where it will show the status of your cluster; creation takes some time.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zntmhWyL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3014/1%2ASad4NAtf5b9n3l44S3w6Rw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zntmhWyL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3014/1%2ASad4NAtf5b9n3l44S3w6Rw.jpeg" alt="" width="800" height="351"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The above image shows the status of the cluster, and the status is “Modifying Creating”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EepCjtik--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2952/1%2AymavhlE2_RWjvoQULBsVcw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EepCjtik--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2952/1%2AymavhlE2_RWjvoQULBsVcw.jpeg" alt="" width="800" height="104"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After the cluster has been created, the status will show “Available”.&lt;/p&gt;

&lt;p&gt;Now our cluster is created with some demo data inside the databases.&lt;/p&gt;

&lt;p&gt;After that, we have to change the VPC security group so that Python and Redshift can communicate.&lt;/p&gt;

&lt;p&gt;First, click your Redshift cluster's name, scroll down, and click on the “Properties” tab.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qQycDjyz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3084/1%2AoZrqCTk7DKp7Z-A189-tzQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qQycDjyz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3084/1%2AoZrqCTk7DKp7Z-A189-tzQ.jpeg" alt="" width="800" height="340"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Inside the “Properties” tab we will see “Network and security settings”; click on the link under “VPC security group”, shown in the blue box in the diagram below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1DWxHV3l--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3038/1%2AasXqLR_DNSQtQDKIfg_esw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1DWxHV3l--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3038/1%2AasXqLR_DNSQtQDKIfg_esw.jpeg" alt="" width="800" height="146"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Clicking that link redirects us to the security group page, where we look for “Inbound rules”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Hn0FhBi4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2720/1%2A5O4RUSAHQIHaJzBbeXTdFg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Hn0FhBi4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2720/1%2A5O4RUSAHQIHaJzBbeXTdFg.jpeg" alt="" width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The inbound rules are at the bottom of the screen, in the green box shown in the image above.&lt;/p&gt;

&lt;p&gt;Clicking “Inbound rules” shows the list of existing rules; click “Edit inbound rules” to change some of their properties.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_nNtYsse--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2616/1%2AlXnCFTEwp9KJfkW461Br2Q.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_nNtYsse--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2616/1%2AlXnCFTEwp9KJfkW461Br2Q.jpeg" alt="" width="800" height="133"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then click “Add rule”, which adds a new empty rule for us to fill in.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jt3suFTD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3028/1%2ANnL6pAF_Fu7i91H4ujbdDQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jt3suFTD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3028/1%2ANnL6pAF_Fu7i91H4ujbdDQ.jpeg" alt="" width="800" height="255"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hMRtLDk4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3020/1%2A0qLLVW7N4m1tHWAt3LJqRw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hMRtLDk4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3020/1%2A0qLLVW7N4m1tHWAt3LJqRw.jpeg" alt="" width="800" height="274"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We need to modify the fields inside the green box in the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_TX7T5-R--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AU6nGjM-iDQf_wBOKFRkKXg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_TX7T5-R--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AU6nGjM-iDQf_wBOKFRkKXg.jpeg" alt="" width="747" height="202"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With the VPC security group configured, Python will be able to connect to AWS Redshift.&lt;/p&gt;
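&lt;p&gt;For readers who prefer to script this step, here is a minimal, untested sketch of the same inbound rule using boto3. The security group ID, CIDR range, and rule description are placeholder assumptions; Redshift listens on port 5439 by default:&lt;/p&gt;

```python
def redshift_ingress_rule(cidr="0.0.0.0/0", port=5439):
    """Build the IpPermissions entry for an inbound rule.

    Redshift listens on port 5439 by default; 0.0.0.0/0 opens the port
    to everyone and is only sensible for a throwaway demo cluster.
    """
    return {
        "IpProtocol": "tcp",
        "FromPort": port,
        "ToPort": port,
        "IpRanges": [{"CidrIp": cidr, "Description": "Redshift demo access"}],
    }


def add_inbound_rule(group_id, rule):
    """Apply the rule to a security group (needs real AWS credentials)."""
    import boto3  # imported here so the sketch loads even without boto3 installed

    ec2 = boto3.client("ec2")
    ec2.authorize_security_group_ingress(GroupId=group_id, IpPermissions=[rule])


# Example (not run here): add_inbound_rule("sg-0123456789abcdef0", redshift_ingress_rule())
```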

&lt;p&gt;Only one thing remains: the cluster must be made publicly accessible. Go back to the cluster list and tick the checkbox next to the cluster.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--t89vY-on--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2526/1%2AW9r6P8TFpU4XFUkms1KBBg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--t89vY-on--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2526/1%2AW9r6P8TFpU4XFUkms1KBBg.jpeg" alt="" width="800" height="303"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After selecting the cluster, go to “Actions” and choose “Modify publicly accessible setting”. A prompt window appears, where we enable the setting.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BhEge7oi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2ARkHDTKbX6p36lq2Oq10KYg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BhEge7oi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2ARkHDTKbX6p36lq2Oq10KYg.jpeg" alt="" width="561" height="300"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click “Enable” and save the changes. Redshift is now ready for Python.&lt;/p&gt;
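&lt;p&gt;As a preview of what the connection looks like, here is a minimal sketch using psycopg2 (Redshift speaks the PostgreSQL wire protocol). The endpoint, database name, user, and password defaults here are placeholders, not values from a real cluster:&lt;/p&gt;

```python
def connection_params(host, dbname="dev", user="awsuser",
                      password="secret", port=5439):
    """Assemble keyword arguments for a Redshift connection.

    All defaults are placeholders; substitute the endpoint, database name,
    and credentials of your own cluster.
    """
    return {"host": host, "dbname": dbname, "user": user,
            "password": password, "port": port}


def run_query(host, sql):
    """Connect and run one query (needs a reachable, publicly accessible cluster)."""
    import psycopg2  # imported here so the sketch loads without psycopg2 installed

    with psycopg2.connect(**connection_params(host)) as conn:
        with conn.cursor() as cur:
            cur.execute(sql)
            return cur.fetchall()
```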

&lt;p&gt;I will publish the Python code in part 2 of this article.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>My Coding Vlog — Day 1</title>
      <dc:creator>Shoeb Ahmed</dc:creator>
      <pubDate>Thu, 16 Jun 2022 07:20:54 +0000</pubDate>
      <link>https://dev.to/shoeb370/my-coding-vlog-day-1-3pnb</link>
      <guid>https://dev.to/shoeb370/my-coding-vlog-day-1-3pnb</guid>
      <description>&lt;h2&gt;
  
  
  My Coding Vlog — Day 1
&lt;/h2&gt;

&lt;p&gt;Here I am going to share my experience as a coding vlog.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://youtu.be/BexDAEtjKNg"&gt;https://youtu.be/BexDAEtjKNg&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I have shared a YouTube link above which shows the code output.&lt;/p&gt;

&lt;p&gt;In that video I have restarted from the basics of Python, following the well-known book Automate the Boring Stuff with Python: Practical Programming for Total Beginners.&lt;br&gt;
&lt;a href="https://github.com/shoeb370/my_coding_vlog/blob/main/day_01.ipynb"&gt;&lt;strong&gt;my_coding_vlog/day_01.ipynb at main · shoeb370/my_coding_vlog&lt;/strong&gt;&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>AWS S3 with python by Shoeb Ahmed</title>
      <dc:creator>Shoeb Ahmed</dc:creator>
      <pubDate>Thu, 16 Jun 2022 07:18:59 +0000</pubDate>
      <link>https://dev.to/shoeb370/aws-s3-with-python-by-shoeb-ahmed-1nm9</link>
      <guid>https://dev.to/shoeb370/aws-s3-with-python-by-shoeb-ahmed-1nm9</guid>
      <description>&lt;h2&gt;
  
  
  AWS S3 with python by Shoeb Ahmed
&lt;/h2&gt;

&lt;p&gt;Amazon Web Services is an online platform that provides scalable and cost-effective cloud computing solutions.&lt;/p&gt;

&lt;p&gt;Here we are going to explore the methods (functions) in the boto3 package for working with S3 bucket resources.&lt;/p&gt;

&lt;p&gt;I am going to skip most of the theory and move step by step instead.&lt;/p&gt;

&lt;p&gt;Make sure we have an AWS account and a root user.&lt;/p&gt;

&lt;p&gt;Once logged in, we just need an access key to connect Python to AWS.&lt;/p&gt;

&lt;p&gt;Go to the home console and, in the top-right corner, click on your username as shown below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YnRjo3kv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3150/1%2AUoSccDQH5_2qMb8d9eb3iw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YnRjo3kv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/3150/1%2AUoSccDQH5_2qMb8d9eb3iw.png" alt="" width="800" height="351"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the drop-down that opens, click on security credentials.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3mwbbo6E--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2ARl9HznTAgavPSNn_qCZ3GA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3mwbbo6E--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2ARl9HznTAgavPSNn_qCZ3GA.png" alt="" width="316" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This redirects us to the security page, where we click on Access Keys.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--d-neow1x--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2554/1%2AXfZugyhEKm_AGGtO-j-DeQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--d-neow1x--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2554/1%2AXfZugyhEKm_AGGtO-j-DeQ.png" alt="" width="800" height="283"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Under Access keys, click on “Create New Access Key”, and in the pop-up window that appears, click “Download key files”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--GxVjbD6r--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2480/1%2Aj1aG8fn6xSemvKWqEzvpLw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GxVjbD6r--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2480/1%2Aj1aG8fn6xSemvKWqEzvpLw.png" alt="" width="800" height="185"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A key file will be downloaded as a spreadsheet; from it we copy the AWS Access Key ID and the AWS Secret Access Key.&lt;/p&gt;

&lt;p&gt;My goal is to explore all the functions (methods) in S3 Bucket.&lt;/p&gt;

&lt;p&gt;Let’s start coding.&lt;/p&gt;

&lt;p&gt;I am going to do every task with a pandas DataFrame only. If you want any other file format, find my email on my profile and message me, or leave a comment.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
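&lt;p&gt;The gist embedded above is empty in this export, so here is a minimal sketch of the bucket-creation step with boto3. The bucket name, region handling, and credential placeholders are my assumptions, not the original gist:&lt;/p&gt;

```python
def new_bucket_kwargs(name, region="us-east-1"):
    """Build kwargs for create_bucket; regions other than us-east-1 must
    pass a CreateBucketConfiguration with a LocationConstraint."""
    kwargs = {"Bucket": name}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs


def create_bucket(name, region="us-east-1"):
    """Create the bucket (needs real AWS credentials)."""
    import boto3  # imported here so the sketch loads without boto3 installed

    s3 = boto3.client(
        "s3",
        aws_access_key_id="YOUR_ACCESS_KEY_ID",          # from the downloaded key file
        aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",  # keep this out of source control
        region_name=region,
    )
    s3.create_bucket(**new_bucket_kwargs(name, region))
```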
&lt;br&gt;
After running the code, you will see that the new bucket has been created in S3.

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6TOoPwY6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2Aw5Ivzae15b0QWoA-JkCRPA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6TOoPwY6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2Aw5Ivzae15b0QWoA-JkCRPA.png" alt="" width="800" height="42"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, we are going to upload a CSV file to the S3 bucket.&lt;/p&gt;

&lt;p&gt;First, we will create a simple DataFrame.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
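&lt;p&gt;The gist embed above is empty, so here is a small stand-in for the DataFrame step; the column names and values are made up for illustration:&lt;/p&gt;

```python
import pandas as pd

# A small demo DataFrame; the screenshot below showed similar toy data.
df = pd.DataFrame({"name": ["alpha", "beta", "gamma"],
                   "score": [10, 20, 30]})

# Serialize to CSV text; this is the content we will push to S3 later.
csv_text = df.to_csv(index=False)
print(csv_text)
```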
&lt;br&gt;
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BTY0Yw6c--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AOnzrzKba_8-0q-PN8kMk8w.png" alt="" width="355" height="122"&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--a8z4-LcM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AM1DTK2j1NNx8DjVQh7mBlg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--a8z4-LcM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AM1DTK2j1NNx8DjVQh7mBlg.png" alt="" width="614" height="32"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we are going to upload a CSV file stored on the local system, without creating a folder inside the bucket.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
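&lt;p&gt;The gist above did not render, so here is a sketch of the plain upload with upload_file. The bucket name and local path are placeholders:&lt;/p&gt;

```python
import os


def object_key_for(local_path):
    """Derive the S3 object key from a local path: with no folder created,
    the key is just the file name."""
    return os.path.basename(local_path)


def upload_csv(bucket, local_path):
    """Upload the local file to the bucket root (needs real AWS credentials)."""
    import boto3  # imported here so the sketch loads without boto3 installed

    boto3.client("s3").upload_file(local_path, bucket, object_key_for(local_path))


# Example (not run here): upload_csv("my-demo-bucket-370", "data/demo.csv")
```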
&lt;br&gt;
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PlpOWiyx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AotPF5UIaqWui2TABTv0ZMw.png" alt="" width="800" height="439"&gt;

&lt;p&gt;Now we are going to create a folder inside the bucket and upload the file into that folder.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
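&lt;p&gt;Again the gist embed is empty; here is a sketch of uploading under a folder-style key prefix (the folder, file, and bucket names are placeholders):&lt;/p&gt;

```python
def folder_key(folder, filename):
    """S3 has no real folders: a 'folder' is just a key prefix ending in '/'."""
    return folder.rstrip("/") + "/" + filename


def upload_into_folder(bucket, folder, local_path, filename):
    """Upload so the object appears inside a folder in the S3 console
    (needs real AWS credentials)."""
    import boto3  # imported here so the sketch loads without boto3 installed

    boto3.client("s3").upload_file(local_path, bucket, folder_key(folder, filename))


# Example (not run here):
# upload_into_folder("my-demo-bucket-370", "csv-files", "data/demo.csv", "demo.csv")
```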
&lt;br&gt;
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qrIRVThg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2054/1%2AJDIMqcYgRmOcm2szq0PHbA.png" alt="" width="800" height="397"&gt;

&lt;p&gt;If you have a bytes-like (binary) file or a stream, we can use the upload_fileobj function.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
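&lt;p&gt;A sketch of the streaming variant with upload_fileobj, standing in for the empty gist embed above (bucket and key names are placeholders):&lt;/p&gt;

```python
import io


def as_bytes_stream(text):
    """Wrap text in an in-memory binary stream; upload_fileobj expects a
    file-like object opened in binary mode."""
    return io.BytesIO(text.encode("utf-8"))


def upload_stream(bucket, key, text):
    """Stream content straight to S3 without a temporary file
    (needs real AWS credentials)."""
    import boto3  # imported here so the sketch loads without boto3 installed

    boto3.client("s3").upload_fileobj(as_bytes_stream(text), bucket, key)


# Example (not run here):
# upload_stream("my-demo-bucket-370", "demo-stream.csv", "name,score\nalpha,10\n")
```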
&lt;br&gt;
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BBj5wHuY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2AMk6EE15bUblrsJ8bqGQbjA.png" alt="" width="800" height="88"&gt;

&lt;p&gt;We can also upload using a PutObject request; you must have the s3:PutObject action in your IAM permissions.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
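&lt;p&gt;And a sketch of the PutObject variant, standing in for the empty gist above (bucket and key names are placeholders):&lt;/p&gt;

```python
def put_object_kwargs(bucket, key, body_text):
    """Build the kwargs for put_object; Body may be bytes or a file object."""
    return {"Bucket": bucket, "Key": key, "Body": body_text.encode("utf-8")}


def put_csv(bucket, key, body_text):
    """Upload via the lower-level PutObject API (requires the s3:PutObject
    permission and real AWS credentials)."""
    import boto3  # imported here so the sketch loads without boto3 installed

    boto3.client("s3").put_object(**put_object_kwargs(bucket, key, body_text))


# Example (not run here):
# put_csv("my-demo-bucket-370", "demo-put.csv", "name,score\nalpha,10\n")
```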
&lt;br&gt;
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0NLreX4_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A24qmx1dL_1al1b4Zq8zZUg.png" alt="" width="800" height="83"&gt;

</description>
    </item>
    <item>
      <title>Getting NSE data using python.</title>
      <dc:creator>Shoeb Ahmed</dc:creator>
      <pubDate>Tue, 22 Mar 2022 17:20:59 +0000</pubDate>
      <link>https://dev.to/shoeb370/getting-nse-data-using-python-mce</link>
      <guid>https://dev.to/shoeb370/getting-nse-data-using-python-mce</guid>
      <description>&lt;h2&gt;
  
  
  Getting NSE data using python.
&lt;/h2&gt;

&lt;p&gt;If you like coding as a hobby, or like to explore things through code, Python is the best choice from my point of view.&lt;/p&gt;

&lt;p&gt;I am going to demonstrate how we can extract simple stock market data using Python, from the &lt;strong&gt;NSE&lt;/strong&gt; (&lt;strong&gt;National Stock Exchange of India Ltd&lt;/strong&gt;). This is just sample code showing how you can use it in your application; you can explore further from this blog.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install nsepy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;p&gt;Now I am going to extract “&lt;strong&gt;ICICI Bank&lt;/strong&gt;” share data for the 30 days up to the current date.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
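&lt;p&gt;The gist embed above is empty, so here is a minimal sketch of the 30-day extraction with nsepy; the output file name is a placeholder:&lt;/p&gt;

```python
from datetime import date, timedelta


def last_30_day_window(today=None):
    """Return (start, end) covering the 30 days up to today."""
    today = today or date.today()
    return today - timedelta(days=30), today


def fetch_icici(csv_path="icici_last_30_days.csv"):
    """Download ICICI Bank daily data and save it as CSV (needs network
    access and the nsepy package)."""
    from nsepy import get_history  # pip install nsepy

    start, end = last_30_day_window()
    data = get_history(symbol="ICICIBANK", start=start, end=end)
    data.to_csv(csv_path)
    return data
```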

&lt;p&gt;After running the above code, you will get the data in CSV format as shown below, though the dates and values will differ.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--kl_jmH0Q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A0zxs6-RL3QXFgxTOYWjbhA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--kl_jmH0Q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/1%2A0zxs6-RL3QXFgxTOYWjbhA.png" alt="" width="800" height="280"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can explore more on this topic on: &lt;a href="https://nsepy.xyz/"&gt;https://nsepy.xyz/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For any queries, drop me a question on my LinkedIn page.&lt;br&gt;
&lt;a href="https://www.linkedin.com/in/shoeb-ahmed-97211211a/"&gt;&lt;strong&gt;Shoeb Ahmed - Trainee Software Engineer - Aress Software | LinkedIn&lt;/strong&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now for options and Bank Nifty data:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;stock_opt = get_history(symbol="SBIN",
                        start=date(2015,1,1),
                        end=date(2015,1,10),
                        option_type="CE",
                        strike_price=300,
                        expiry_date=date(2015,1,29))

nifty_next50 = get_history(symbol="NIFTY NEXT 50",
                            start=date(2015,1,1),
                            end=date(2015,1,10),
                            index=True)

Bank_Nifty = get_history(symbol="BANKNIFTY",
                            start=date(2015,1,1),
                            end=date(2015,1,10),
                            index=True)

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
    </item>
  </channel>
</rss>
