Shoeb Ahmed

Run Python Code on AWS Batch, Part 2: Uploading the Docker Image to ECR and Creating a Compute Environment

In the previous article, I showed how to create and run a simple Python script, and how to write a simple Dockerfile and build a Docker image that runs on a local system.

AWS Batch + Python

In this article, we move to the AWS Batch side. To run a job on AWS Batch, we need an image (our Docker image) and a compute environment. The steps are:

  1. Create a repository in “Amazon Elastic Container Registry”.

  2. Push the Docker image to the repository.

  3. Create a “Compute Environment” in AWS Batch.

To run a Docker image on AWS Batch, we need to store it on the AWS platform. We are going to use **Amazon Elastic Container Registry (ECR)**, where we can store our Docker image. Search for **Amazon Elastic Container Registry** in the AWS console and we will see the ECR dashboard.

Click “Get Started”. After that, we will land on the “Create repository” page.

I am going to use a public repository because we are in the learning phase.

Then give the repository a name, which is a compulsory field.

I am selecting all **Content types**, but you can choose whichever apply to you.

Then I am going to click **Create repository**. After that, the page redirects to the repository list, where we can see our new repository.
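If you prefer the command line over the console, the same public repository can also be created with the AWS CLI. This is only a sketch: the repository name below is a placeholder, and ECR Public repositories are always created against `us-east-1`.

```bash
# Create a public ECR repository (placeholder name: my-batch-repo)
aws ecr-public create-repository \
    --repository-name my-batch-repo \
    --region us-east-1
```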

Now we click on our repository name, and then click on “View push commands”.

After clicking that button, we will see a list of push commands; we need to follow all of them.

So, I am going to execute the commands line by line in the command prompt.

Make sure that your current working directory is the one where you saved your Python script and Dockerfile.

First, we are going to copy and run the first command, which logs Docker in to ECR.
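For a public registry it looks roughly like the sketch below; note that ECR Public authentication always goes through the `us-east-1` region, whatever region your other resources are in.

```bash
# Get a temporary ECR Public password and pipe it into docker login
aws ecr-public get-login-password --region us-east-1 | \
    docker login --username AWS --password-stdin public.ecr.aws
```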

We should get “Login Succeeded”. If you are not getting it, try configuring the AWS CLI again.
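Reconfiguring simply means re-entering your access key, secret key, and default region:

```bash
# Interactively re-enter the access key ID, secret access key, and default region
aws configure
```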

Now I am going to run the second command, which builds the Docker image.
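The build uses the Dockerfile in the current directory; the repository name below is just a placeholder, yours comes from the “View push commands” panel.

```bash
# Build the image from the Dockerfile in the current working directory
docker build -t my-batch-repo .
```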

Then the third command, which tags the image with the repository URI.
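It looks roughly like this, with a placeholder registry alias and repository name:

```bash
# Tag the local image with the full ECR Public repository URI
docker tag my-batch-repo:latest public.ecr.aws/<registry-alias>/my-batch-repo:latest
```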

Now the last command will push your image to the repository.
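Again with a placeholder alias and repository name:

```bash
# Push the tagged image to the ECR Public repository
docker push public.ecr.aws/<registry-alias>/my-batch-repo:latest
```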

Once the push finishes, we can see an image in the repository with the tag ‘latest’.

Now we will move into AWS Batch. First, we will create a “Compute Environment”.

Click on “Compute environments” on the left-hand side of the AWS Batch dashboard.

Click on “Create” on the right-hand side of the Compute environments page.

Fill in the details given below:

In the Instance configuration, I am using Spot instances.

Instance Configuration — 1

Instance Configuration — 2

For **Networking**, I am using the same VPC ID that I used in the AWS Redshift article, which is linked here: https://medium.com/codex/aws-redshift-connects-with-python-part-1-setup-a-redshift-connection-with-python-b9f6a1fa49f0

After that, click on “Create compute environment”. We will then see that the status of the compute environment is VALID.
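The console does all of this for us, but for reference, roughly the same managed Spot compute environment could be sketched with the AWS CLI as below. Every name, subnet, security group, and role here is a placeholder; adjust them to your own account.

```bash
# Sketch only: create a managed Spot compute environment for AWS Batch.
# All names, subnets, security groups, and roles are placeholders.
# If --service-role is omitted, Batch uses its service-linked role.
aws batch create-compute-environment \
    --compute-environment-name my-batch-compute-env \
    --type MANAGED \
    --state ENABLED \
    --compute-resources '{
        "type": "SPOT",
        "allocationStrategy": "SPOT_CAPACITY_OPTIMIZED",
        "minvCpus": 0,
        "maxvCpus": 4,
        "desiredvCpus": 0,
        "instanceTypes": ["optimal"],
        "subnets": ["subnet-xxxxxxxx"],
        "securityGroupIds": ["sg-xxxxxxxx"],
        "instanceRole": "ecsInstanceRole"
    }'
```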

We will see the creation of the Job Queue and Job Definition in the next article, Part 3.
