DEV Community

vahdet

Creating a CI/CD pipeline for a Python3 Lambda Function through AWS CloudFormation

Creating an AWS Lambda function is often as straightforward as selecting the language and pasting your code into the console. However, you will still want a version control system if you are not simply playing around with functions but plan to run them for serious purposes, just like any good old server application.

It would not surprise me to learn that you prefer Git for that purpose. Taking it one step further, you are probably also using GitHub as your remote codebase. So far so good: in the following lines we are going to publish your code as a Lambda function every time you commit to your preferred Git branch.

Matrix Reloaded source room with a wall of screens

Part #1: Preparing Source Code Repo

We are going to discuss the content of a desired source code repo for our function. You can read through or check the sample repo here:

https://github.com/vahdet/python-lambda-template

As the name suggests, it expects some code in Python; the version used here is 3.8.

Write Your Code

The default Python Lambda function in AWS consists of a single file named lambda_function.py containing a single function, lambda_handler. As long as we update the Handler field in template.yaml accordingly, we can choose any name for both, but for the sake of simplicity we are going to stick to the defaults.

So, basically, add a file named lambda_function.py to your repo and initialize its content like the console-generated one:

# lambda_function.py
""" Lambda function module """
import os
import json
import boto3

def lambda_handler(event, context):
    """ Handles function event and context """

    # resolve backend api key from the secrets manager
    sm_client = boto3.client('secretsmanager')
    sm_resp = sm_client.get_secret_value(
        SecretId=os.getenv('BACKEND_SERVICE_API_KEY_SECRET_ARN')
    )
    backend_api_key = json.loads(sm_resp['SecretString']).get('key')

    # TODO implement further business logic
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }

So, writing down all your business logic here is not an urgent matter. Enjoy implementing right away or defer it. Here are some points you may consider:

  • You can add other functions or closures and call them from the main handler function (i.e. lambda_handler)

  • The main function takes two arguments, event and context. For a wider understanding of their contents, please refer to the official docs
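For instance, a handler can inspect fields of the incoming event payload. Here is a minimal sketch; the "name" key is a made-up example, not part of any real event schema:

```python
# minimal_handler.py -- illustrative only; the "name" key is hypothetical
import json

def lambda_handler(event, context):
    # event is a plain dict built from the trigger payload;
    # context carries runtime metadata (request id, remaining time, etc.)
    name = event.get('name', 'world')
    return {
        'statusCode': 200,
        'body': json.dumps(f'Hello, {name}!')
    }
```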

List Your Required Libraries in requirements.txt

The more you write your stuff, the more likely you will need some third-party libraries.

When developing local scripts you can simply run

pip install somelibrary

However, as the number of them grows, it becomes a burden to keep track of the libs. That's when you consider collecting them in a file called requirements.txt. We also adopt this approach for centralizing the dependencies and fetching them before bundling our code.
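For the handler above, a minimal requirements.txt could look like the following. Note that boto3 is actually preinstalled in the Lambda runtime, so pinning it here is optional; the version number is a placeholder, not a recommendation:

```
# requirements.txt
boto3==1.26.0
```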

The buildspec.yml file

By bundling I mean bundling. Really! We generally use Python as an interpreted language in our daily lives, so compiling our Python project into an executable is usually not the case. But we are expected to collect the dependencies of our code, put everything in a deployment package and then push the result as a Lambda function: we cannot just hand Lambda a requirements list and expect it to download them.

So, we keep that preparation logic in a file called buildspec.yml. Just like lambda_function.py and requirements.txt, it must be located at the root level of the repo.

The name buildspec is not an outcome of fancy marketing/brainstorming sessions but a default convention of AWS CodeBuild. We are going to dig into CodeBuild when defining the pipeline; for now, just be aware that a buildspec.yml has pre-defined keys & syntax, and something like the following will collect our requirements and use an AWS CLI command (package) to package (or bundle :) our project:

# buildspec.yml
version: 0.2

phases:
  install:
    commands:
      # Install/upgrade pip and AWS CLI 
      - pip install --upgrade pip awscli
      # Install the packages required
      - pip install -r requirements.txt -t .
  build:
    commands:
      # LAMBDA_ARTIFACT_STORE_BUCKET should be an environment variable in AWS::CodeBuild::Project
      - aws cloudformation package --s3-bucket $LAMBDA_ARTIFACT_STORE_BUCKET --template-file template.yaml --output-template-file output-template.yaml

artifacts:
  type: zip
  files:
    - template.yaml 
    - output-template.yaml

Here, LAMBDA_ARTIFACT_STORE_BUCKET is defined as an environment variable. To see where it is defined, we are again bound to wait for our CI/CD pipeline section.
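As a preview, the variable would be declared on the AWS::CodeBuild::Project resource roughly like this. The resource names below are placeholders of mine; the real definition lives in the pipeline template:

```yaml
# excerpt from an AWS::CodeBuild::Project resource (names are placeholders)
BuildProject:
  Type: AWS::CodeBuild::Project
  Properties:
    Environment:
      ComputeType: BUILD_GENERAL1_SMALL
      Image: aws/codebuild/standard:4.0
      Type: LINUX_CONTAINER
      EnvironmentVariables:
        - Name: LAMBDA_ARTIFACT_STORE_BUCKET
          Value: !Ref ArtifactStoreBucket
```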

The template.yaml file

The aws cloudformation package command in the previous section takes a --template-file argument. Now we are going to create a file to act as a CloudFormation template for the function itself (not the pipeline yet). You can check the AWS::Serverless::Function documentation to dive deeper, but with the following example we declare that our function will:

  • have permission to write logs and read the API key secret,

  • make use of an environment variable called BACKEND_SERVICE_BASE_URL. It is just a dummy value to depict how environment variables are specified when creating a Lambda function not via the console or local AWS CLI commands, but through CloudFormation.

AWSTemplateFormatVersion : "2010-09-09"
Transform: AWS::Serverless-2016-10-31
Description: A sample SAM template for deploying Lambda functions.
Parameters:
  FunctionName:
    Type: String
  BackendServiceBaseURL:
    Type: String
  BackendServiceApiKeySecretArn:
    Type: String
  Environment:
    Type: String
  Application:
    Type: String
Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: !Ref FunctionName
      CodeUri: ./
      Runtime: python3.8
      Handler: lambda_function.lambda_handler
      Environment:
        Variables:
          BACKEND_SERVICE_BASE_URL: !Ref BackendServiceBaseURL
          BACKEND_SERVICE_API_KEY_SECRET_ARN: !Ref BackendServiceApiKeySecretArn
      Policies:
        - Version: "2012-10-17"
          Statement:
            - Effect: Allow
              Action:
                - "logs:*"
                - "secretsmanager:GetSecretValue"
              Resource: "*" 
      Tags:
        Environment: !Ref Environment
        Application: !Ref Application

The Parameters will make more sense when we head for the CI/CD pipeline. Which is, indeed, next!
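To sketch how the pipeline eventually feeds those parameters in, a CloudFormation deploy action in CodePipeline passes values through its ParameterOverrides configuration, roughly like this. All stack, artifact, and parameter values below are placeholders, not the real pipeline definition:

```yaml
# excerpt from a CodePipeline CloudFormation deploy action (values are placeholders)
Configuration:
  ActionMode: CREATE_UPDATE
  StackName: my-function-stack
  TemplatePath: BuildOutput::output-template.yaml
  Capabilities: CAPABILITY_IAM,CAPABILITY_AUTO_EXPAND
  ParameterOverrides: |
    {
      "FunctionName": "my-function",
      "BackendServiceBaseURL": "https://api.example.com",
      "Environment": "dev",
      "Application": "sample-app"
    }
```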

A stack of files. Blurred.

Part #2: Preparing Cloudformation templates

We are going to use nested stacks for a slightly easier-to-maintain structure.

The templates can be found here: https://github.com/vahdet/func-cicd-pipeline-cfn-template . Again, you can fork it or copy the following templates from it:

  • main.yaml (parent)

  • iam.yaml (child)

  • storage.yaml (child)

  • pipeline.yaml (child)

Prework

Create an S3 bucket via the console (https://aws.amazon.com/console/). Make sure it is in the same region where you want your resources and the Lambda function to reside.

Upload those four files to this bucket.

Note down the name of the bucket; we are going to use it in the path to find the child stacks (yes, CloudFormation is not aware of the bucket it fetches the stack template files from).

Create stack

  • Navigate to the CloudFormation page in the AWS console.

  • Create a stack, selecting the full URL to main.yaml (e.g. https://${TemplateS3BucketName}.s3.${AwsRegion}.amazonaws.com/main.yaml) in the S3 bucket from the previous section.

Part #3: A CI/CD pipeline built with CloudFormation for the CloudFormation templates of the function 🐍🐍🐍

Nah, we are not building an Ouroboros here. But if you know a wise way to publish our CloudFormation templates to their S3 bucket, do not hesitate to post it.
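One low-tech option (an assumption on my part, not something the repo ships) is a second, tiny CodeBuild project whose buildspec simply syncs the templates to the bucket on every commit:

```yaml
# buildspec.yml -- hypothetical template-publishing build
version: 0.2

phases:
  build:
    commands:
      # TEMPLATE_S3_BUCKET would be an environment variable on the CodeBuild project
      - aws s3 sync . s3://$TEMPLATE_S3_BUCKET --exclude "*" --include "*.yaml"
```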

Cheers!
